[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
15621 1726882567.22287: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-AQL
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
15621 1726882567.23211: Added group all to inventory
15621 1726882567.23213: Added group ungrouped to inventory
15621 1726882567.23218: Group all now contains ungrouped
15621 1726882567.23221: Examining possible inventory source: /tmp/network-mVt/inventory.yml
15621 1726882567.44696: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
15621 1726882567.44764: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
15621 1726882567.44790: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
15621 1726882567.44857: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
15621 1726882567.44939: Loaded config def from plugin (inventory/script)
15621 1726882567.44941: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
15621 1726882567.44987: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
15621 1726882567.45086: Loaded config def from plugin (inventory/yaml)
15621 1726882567.45089: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
15621 1726882567.45185: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
15621 1726882567.45673: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
15621 1726882567.45676: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
15621 1726882567.45680: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
15621 1726882567.45686: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
15621 1726882567.45691: Loading data from /tmp/network-mVt/inventory.yml
15621 1726882567.45767: /tmp/network-mVt/inventory.yml was not parsable by auto
15621 1726882567.45841: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
15621 1726882567.45886: Loading data from /tmp/network-mVt/inventory.yml
15621 1726882567.45980: group all already in inventory
15621 1726882567.45988: set inventory_file for managed_node1
15621 1726882567.45992: set inventory_dir for managed_node1
15621 1726882567.45993: Added host managed_node1 to inventory
15621 1726882567.45996: Added host managed_node1 to group all
15621 1726882567.45997: set ansible_host for managed_node1
15621 1726882567.45998:
set ansible_ssh_extra_args for managed_node1 15621 1726882567.46001: set inventory_file for managed_node2 15621 1726882567.46004: set inventory_dir for managed_node2 15621 1726882567.46004: Added host managed_node2 to inventory 15621 1726882567.46006: Added host managed_node2 to group all 15621 1726882567.46007: set ansible_host for managed_node2 15621 1726882567.46008: set ansible_ssh_extra_args for managed_node2 15621 1726882567.46010: set inventory_file for managed_node3 15621 1726882567.46013: set inventory_dir for managed_node3 15621 1726882567.46014: Added host managed_node3 to inventory 15621 1726882567.46015: Added host managed_node3 to group all 15621 1726882567.46016: set ansible_host for managed_node3 15621 1726882567.46017: set ansible_ssh_extra_args for managed_node3 15621 1726882567.46019: Reconcile groups and hosts in inventory. 15621 1726882567.46025: Group ungrouped now contains managed_node1 15621 1726882567.46028: Group ungrouped now contains managed_node2 15621 1726882567.46029: Group ungrouped now contains managed_node3 15621 1726882567.46109: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 15621 1726882567.46249: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 15621 1726882567.46304: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 15621 1726882567.46336: Loaded config def from plugin (vars/host_group_vars) 15621 1726882567.46338: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 15621 1726882567.46345: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 15621 1726882567.46354: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 15621 1726882567.46403: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 15621 1726882567.46749: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882567.46851: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 15621 1726882567.46899: Loaded config def from plugin (connection/local) 15621 1726882567.46902: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 15621 1726882567.47654: Loaded config def from plugin (connection/paramiko_ssh) 15621 1726882567.47657: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 15621 1726882567.48634: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 15621 1726882567.48674: Loaded config def from plugin (connection/psrp) 15621 1726882567.48677: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 15621 1726882567.49616: Loading ModuleDocFragment 'connection_pipelining' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 15621 1726882567.49664: Loaded config def from plugin (connection/ssh) 15621 1726882567.49672: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 15621 1726882567.52088: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 15621 1726882567.52144: Loaded config def from plugin (connection/winrm) 15621 1726882567.52148: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 15621 1726882567.52183: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 15621 1726882567.52262: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 15621 1726882567.52350: Loaded config def from plugin (shell/cmd) 15621 1726882567.52352: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 15621 1726882567.52381: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 15621 1726882567.52466: Loaded config def from plugin (shell/powershell) 15621 1726882567.52469: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 15621 1726882567.52531: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 15621 1726882567.52757: Loaded config def from plugin (shell/sh) 15621 1726882567.52759: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 15621 1726882567.52810: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 15621 1726882567.52964: Loaded config def from plugin (become/runas) 15621 1726882567.52967: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 15621 1726882567.53195: Loaded config def from plugin (become/su) 15621 1726882567.53198: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 15621 1726882567.53587: Loaded config def from plugin (become/sudo) 15621 1726882567.53590: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 15621 1726882567.53628: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml 15621 1726882567.54397: in VariableManager get_vars() 15621 1726882567.54421: done with get_vars() 15621 1726882567.54878: trying /usr/local/lib/python3.12/site-packages/ansible/modules 15621 1726882567.61572: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 15621 1726882567.61818: in VariableManager get_vars() 
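Note on the [DEPRECATION WARNING] at the top of this run: it is triggered by the plural ANSIBLE_COLLECTIONS_PATHS setting. The warning itself names the fix, so a minimal sketch (assuming the collections really live under /tmp/collections-AQL, as the run header reports) is to switch to the singular environment variable before invoking ansible-playbook; the warning also mentions deprecation_warnings=False in ansible.cfg as a blunter way to hide such messages:

  # Assumed collection location, taken from the run header above.
  unset ANSIBLE_COLLECTIONS_PATHS
  export ANSIBLE_COLLECTIONS_PATH=/tmp/collections-AQL
  # Alternatively, if an ansible.cfg were in use, collections_path could be set
  # under [defaults] there; this run found no config file, so the environment
  # variable route is the relevant one here.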
15621 1726882567.61929: done with get_vars() 15621 1726882567.61933: variable 'playbook_dir' from source: magic vars 15621 1726882567.61934: variable 'ansible_playbook_python' from source: magic vars 15621 1726882567.61935: variable 'ansible_config_file' from source: magic vars 15621 1726882567.61936: variable 'groups' from source: magic vars 15621 1726882567.61937: variable 'omit' from source: magic vars 15621 1726882567.61937: variable 'ansible_version' from source: magic vars 15621 1726882567.61938: variable 'ansible_check_mode' from source: magic vars 15621 1726882567.61939: variable 'ansible_diff_mode' from source: magic vars 15621 1726882567.61940: variable 'ansible_forks' from source: magic vars 15621 1726882567.61941: variable 'ansible_inventory_sources' from source: magic vars 15621 1726882567.61942: variable 'ansible_skip_tags' from source: magic vars 15621 1726882567.61942: variable 'ansible_limit' from source: magic vars 15621 1726882567.61943: variable 'ansible_run_tags' from source: magic vars 15621 1726882567.61944: variable 'ansible_verbosity' from source: magic vars 15621 1726882567.61985: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml 15621 1726882567.63317: in VariableManager get_vars() 15621 1726882567.63339: done with get_vars() 15621 1726882567.63384: in VariableManager get_vars() 15621 1726882567.63413: done with get_vars() 15621 1726882567.63461: in VariableManager get_vars() 15621 1726882567.63475: done with get_vars() 15621 1726882567.63513: in VariableManager get_vars() 15621 1726882567.63528: done with get_vars() 15621 1726882567.63621: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 15621 1726882567.63871: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 15621 1726882567.64043: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 15621 1726882567.64839: in VariableManager get_vars() 15621 1726882567.64861: done with get_vars() 15621 1726882567.65384: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ 15621 1726882567.65553: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 15621 1726882567.69167: in VariableManager get_vars() 15621 1726882567.69191: done with get_vars() 15621 1726882567.69753: in VariableManager get_vars() 15621 1726882567.69757: done with get_vars() 15621 1726882567.69760: variable 'playbook_dir' from source: magic vars 15621 1726882567.69761: variable 'ansible_playbook_python' from source: magic vars 15621 1726882567.69762: variable 'ansible_config_file' from source: magic vars 15621 1726882567.69762: variable 'groups' from source: magic vars 15621 1726882567.69763: variable 'omit' from source: magic vars 15621 1726882567.69764: variable 'ansible_version' from source: magic vars 15621 1726882567.69765: variable 'ansible_check_mode' from source: magic vars 15621 1726882567.69765: variable 'ansible_diff_mode' from source: magic vars 15621 1726882567.69766: variable 'ansible_forks' from source: magic vars 15621 1726882567.69767: variable 'ansible_inventory_sources' from source: magic vars 15621 1726882567.69768: variable 'ansible_skip_tags' from source: magic vars 15621 1726882567.69768: variable 'ansible_limit' from source: magic vars 15621 
1726882567.69772: variable 'ansible_run_tags' from source: magic vars 15621 1726882567.69773: variable 'ansible_verbosity' from source: magic vars 15621 1726882567.69812: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml 15621 1726882567.70099: in VariableManager get_vars() 15621 1726882567.70103: done with get_vars() 15621 1726882567.70105: variable 'playbook_dir' from source: magic vars 15621 1726882567.70106: variable 'ansible_playbook_python' from source: magic vars 15621 1726882567.70107: variable 'ansible_config_file' from source: magic vars 15621 1726882567.70108: variable 'groups' from source: magic vars 15621 1726882567.70109: variable 'omit' from source: magic vars 15621 1726882567.70109: variable 'ansible_version' from source: magic vars 15621 1726882567.70110: variable 'ansible_check_mode' from source: magic vars 15621 1726882567.70111: variable 'ansible_diff_mode' from source: magic vars 15621 1726882567.70112: variable 'ansible_forks' from source: magic vars 15621 1726882567.70112: variable 'ansible_inventory_sources' from source: magic vars 15621 1726882567.70113: variable 'ansible_skip_tags' from source: magic vars 15621 1726882567.70114: variable 'ansible_limit' from source: magic vars 15621 1726882567.70115: variable 'ansible_run_tags' from source: magic vars 15621 1726882567.70116: variable 'ansible_verbosity' from source: magic vars 15621 1726882567.70355: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml 15621 1726882567.70653: in VariableManager get_vars() 15621 1726882567.70666: done with get_vars() 15621 1726882567.70719: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 15621 1726882567.71181: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 15621 1726882567.71265: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 15621 1726882567.72438: in VariableManager get_vars() 15621 1726882567.72461: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 15621 1726882567.76037: in VariableManager get_vars() 15621 1726882567.76059: done with get_vars() 15621 1726882567.76104: in VariableManager get_vars() 15621 1726882567.76107: done with get_vars() 15621 1726882567.76110: variable 'playbook_dir' from source: magic vars 15621 1726882567.76111: variable 'ansible_playbook_python' from source: magic vars 15621 1726882567.76112: variable 'ansible_config_file' from source: magic vars 15621 1726882567.76112: variable 'groups' from source: magic vars 15621 1726882567.76113: variable 'omit' from source: magic vars 15621 1726882567.76114: variable 'ansible_version' from source: magic vars 15621 1726882567.76115: variable 'ansible_check_mode' from source: magic vars 15621 1726882567.76115: variable 'ansible_diff_mode' from source: magic vars 15621 1726882567.76116: variable 'ansible_forks' from source: magic vars 15621 1726882567.76117: variable 'ansible_inventory_sources' from source: magic vars 15621 1726882567.76118: variable 'ansible_skip_tags' from source: magic vars 15621 1726882567.76119: variable 'ansible_limit' from source: magic vars 15621 1726882567.76119: variable 'ansible_run_tags' from source: magic vars 15621 1726882567.76120: variable 'ansible_verbosity' from source: magic vars 
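The exact ansible-playbook command line is not echoed anywhere in this trace. Purely as an assumption reconstructed from the inventory source and playbook path logged above (and from the pid/timestamp debug prefix, which normally indicates ANSIBLE_DEBUG is enabled), a roughly equivalent invocation might look like:

  # Assumption: paths taken from the log above; -vvvv inferred from the ssh debug2 output.
  ANSIBLE_DEBUG=1 ansible-playbook -vvvv \
    -i /tmp/network-mVt/inventory.yml \
    /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml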
15621 1726882567.76464: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml 15621 1726882567.76548: in VariableManager get_vars() 15621 1726882567.76560: done with get_vars() 15621 1726882567.76606: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 15621 1726882567.81144: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 15621 1726882567.81237: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 15621 1726882567.81902: in VariableManager get_vars() 15621 1726882567.82230: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 15621 1726882567.85830: in VariableManager get_vars() 15621 1726882567.85848: done with get_vars() 15621 1726882567.85892: in VariableManager get_vars() 15621 1726882567.85906: done with get_vars() 15621 1726882567.85980: in VariableManager get_vars() 15621 1726882567.85993: done with get_vars() 15621 1726882567.86304: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 15621 1726882567.86320: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 15621 1726882567.86777: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 15621 1726882567.87171: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 15621 1726882567.87174: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-AQL/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) 15621 1726882567.87207: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 15621 1726882567.87439: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 15621 1726882567.87834: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 15621 1726882567.87902: Loaded config def from plugin (callback/default) 15621 1726882567.87904: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 15621 1726882567.91259: Loaded config def from plugin (callback/junit) 15621 1726882567.91263: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 15621 1726882567.91325: Loading ModuleDocFragment 'result_format_callback' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 15621 1726882567.91590: Loaded config def from plugin (callback/minimal) 15621 1726882567.91593: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 15621 1726882567.91944: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 15621 1726882567.92021: Loaded config def from plugin (callback/tree) 15621 1726882567.92026: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks 15621 1726882567.92502: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks) 15621 1726882567.92505: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-AQL/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) Skipping callback 'default', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. 
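The "redirecting (type: callback)" lines above show ansible.builtin.debug and ansible.builtin.profile_tasks resolving to their ansible.posix counterparts, and the "Skipping callback ... we already have a stdout callback" messages mean a non-default stdout callback was already selected. Since no config file was found for this run, that selection most likely came from environment variables; a hedged sketch of one equivalent setup (not shown in the log itself) is:

  # Assumed equivalent configuration for the callback behaviour seen above.
  export ANSIBLE_STDOUT_CALLBACK=ansible.posix.debug
  export ANSIBLE_CALLBACKS_ENABLED=ansible.posix.profile_tasks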
PLAYBOOK: tests_ethernet_nm.yml ************************************************
10 plays in /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml
15621 1726882567.92537: in VariableManager get_vars()
15621 1726882567.92555: done with get_vars()
15621 1726882567.92789: in VariableManager get_vars()
15621 1726882567.92800: done with get_vars()
15621 1726882567.92805: variable 'omit' from source: magic vars
15621 1726882567.92855: in VariableManager get_vars()
15621 1726882567.92874: done with get_vars()
15621 1726882567.93127: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_ethernet.yml' with nm as provider] *********
15621 1726882567.95063: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
15621 1726882567.95534: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
15621 1726882567.95744: getting the remaining hosts for this loop
15621 1726882567.95746: done getting the remaining hosts for this loop
15621 1726882567.95750: getting the next task for host managed_node3
15621 1726882567.95754: done getting next task for host managed_node3
15621 1726882567.95756: ^ task is: TASK: Gathering Facts
15621 1726882567.95758: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15621 1726882567.95766: getting variables
15621 1726882567.95767: in VariableManager get_vars()
15621 1726882567.95783: Calling all_inventory to load vars for managed_node3
15621 1726882567.95787: Calling groups_inventory to load vars for managed_node3
15621 1726882567.95790: Calling all_plugins_inventory to load vars for managed_node3
15621 1726882567.95804: Calling all_plugins_play to load vars for managed_node3
15621 1726882567.95817: Calling groups_plugins_inventory to load vars for managed_node3
15621 1726882567.95821: Calling groups_plugins_play to load vars for managed_node3
15621 1726882567.96053: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15621 1726882567.96335: done with get_vars()
15621 1726882567.96344: done getting variables
15621 1726882567.96538: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml:6
Friday 20 September 2024 21:36:07 -0400 (0:00:00.044) 0:00:00.044 ******
15621 1726882567.96564: entering _queue_task() for managed_node3/gather_facts
15621 1726882567.96565: Creating lock for gather_facts
15621 1726882567.98142: worker is 1 (out of 1 available)
15621 1726882567.98150: exiting _queue_task() for managed_node3/gather_facts
15621 1726882567.98162: done queuing things up, now waiting for results queue to drain
15621 1726882567.98165: waiting for pending results...
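The "Gathering Facts" task queued above runs the setup module over the ssh connection; the AnsiballZ_setup.py payload transfer is visible further down in this trace. As a hedged aside, reproducing just this step against the same host should be possible with something like the following, assuming the same inventory file and that managed_node3 is still reachable:

  # Assumption: same inventory as this run; setup is the module behind fact gathering.
  ansible -i /tmp/network-mVt/inventory.yml managed_node3 -m ansible.builtin.setup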
15621 1726882567.98289: running TaskExecutor() for managed_node3/TASK: Gathering Facts 15621 1726882567.98629: in run() - task 0affc7ec-ae25-af1a-5b92-00000000007c 15621 1726882567.98633: variable 'ansible_search_path' from source: unknown 15621 1726882567.98678: calling self._execute() 15621 1726882567.98830: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882567.98833: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882567.98836: variable 'omit' from source: magic vars 15621 1726882567.99085: variable 'omit' from source: magic vars 15621 1726882567.99139: variable 'omit' from source: magic vars 15621 1726882567.99268: variable 'omit' from source: magic vars 15621 1726882567.99385: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882567.99480: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882567.99508: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882567.99570: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882567.99701: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882567.99705: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882567.99707: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882567.99710: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882567.99953: Set connection var ansible_connection to ssh 15621 1726882567.99998: Set connection var ansible_shell_executable to /bin/sh 15621 1726882568.00039: Set connection var ansible_timeout to 10 15621 1726882568.00047: Set connection var ansible_shell_type to sh 15621 1726882568.00128: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882568.00131: Set connection var ansible_pipelining to False 15621 1726882568.00329: variable 'ansible_shell_executable' from source: unknown 15621 1726882568.00332: variable 'ansible_connection' from source: unknown 15621 1726882568.00335: variable 'ansible_module_compression' from source: unknown 15621 1726882568.00338: variable 'ansible_shell_type' from source: unknown 15621 1726882568.00340: variable 'ansible_shell_executable' from source: unknown 15621 1726882568.00343: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882568.00345: variable 'ansible_pipelining' from source: unknown 15621 1726882568.00347: variable 'ansible_timeout' from source: unknown 15621 1726882568.00352: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882568.00930: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15621 1726882568.00934: variable 'omit' from source: magic vars 15621 1726882568.00937: starting attempt loop 15621 1726882568.00940: running the handler 15621 1726882568.00942: variable 'ansible_facts' from source: unknown 15621 1726882568.00945: _low_level_execute_command(): starting 15621 1726882568.00949: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15621 1726882568.02075: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882568.02103: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882568.02309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882568.02470: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882568.02575: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882568.04367: stdout chunk (state=3): >>>/root <<< 15621 1726882568.04540: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882568.04617: stderr chunk (state=3): >>><<< 15621 1726882568.04826: stdout chunk (state=3): >>><<< 15621 1726882568.04833: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882568.04836: _low_level_execute_command(): starting 15621 1726882568.04839: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882568.0475202-15644-105596870974071 `" && echo ansible-tmp-1726882568.0475202-15644-105596870974071="` echo /root/.ansible/tmp/ansible-tmp-1726882568.0475202-15644-105596870974071 `" ) && sleep 0' 15621 1726882568.06110: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 
1726882568.06134: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882568.06337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882568.06480: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882568.06617: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882568.08602: stdout chunk (state=3): >>>ansible-tmp-1726882568.0475202-15644-105596870974071=/root/.ansible/tmp/ansible-tmp-1726882568.0475202-15644-105596870974071 <<< 15621 1726882568.08740: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882568.09127: stderr chunk (state=3): >>><<< 15621 1726882568.09131: stdout chunk (state=3): >>><<< 15621 1726882568.09134: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882568.0475202-15644-105596870974071=/root/.ansible/tmp/ansible-tmp-1726882568.0475202-15644-105596870974071 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882568.09137: variable 'ansible_module_compression' from source: unknown 15621 1726882568.09139: ANSIBALLZ: Using generic lock for ansible.legacy.setup 15621 1726882568.09142: ANSIBALLZ: Acquiring lock 15621 1726882568.09144: ANSIBALLZ: Lock acquired: 140146888266560 15621 1726882568.09146: ANSIBALLZ: Creating module 15621 1726882568.66471: ANSIBALLZ: Writing module into payload 15621 1726882568.66861: ANSIBALLZ: Writing module 15621 1726882568.66909: ANSIBALLZ: Renaming 
module 15621 1726882568.66926: ANSIBALLZ: Done creating module 15621 1726882568.66972: variable 'ansible_facts' from source: unknown 15621 1726882568.67330: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882568.67334: _low_level_execute_command(): starting 15621 1726882568.67337: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 15621 1726882568.68746: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882568.68791: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882568.68813: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882568.68833: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882568.69020: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882568.70800: stdout chunk (state=3): >>>PLATFORM <<< 15621 1726882568.70894: stdout chunk (state=3): >>>Linux FOUND /usr/bin/python3.12 <<< 15621 1726882568.70917: stdout chunk (state=3): >>>/usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 15621 1726882568.71159: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882568.71162: stdout chunk (state=3): >>><<< 15621 1726882568.71165: stderr chunk (state=3): >>><<< 15621 1726882568.71214: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882568.71236 [managed_node3]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 15621 1726882568.71433: _low_level_execute_command(): starting 15621 1726882568.71605: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 15621 1726882568.71731: Sending initial data 15621 1726882568.71937: Sent initial data (1181 bytes) 15621 1726882568.72289: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882568.72301: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882568.72315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882568.72338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882568.72354: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 15621 1726882568.72365: stderr chunk (state=3): >>>debug2: match not found <<< 15621 1726882568.72384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882568.72402: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15621 1726882568.72413: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.226 is address <<< 15621 1726882568.72496: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882568.72526: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882568.72558: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882568.72673: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882568.77729: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"40 (Forty)\"\nID=fedora\nVERSION_ID=40\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f40\"\nPRETTY_NAME=\"Fedora Linux 40 (Forty)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:40\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f40/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=40\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=40\nSUPPORT_END=2025-05-13\n"} <<< 15621 1726882568.78341: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882568.78437: stderr chunk (state=3): >>><<< 15621 
1726882568.78447: stdout chunk (state=3): >>><<< 15621 1726882568.78474: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"40 (Forty)\"\nID=fedora\nVERSION_ID=40\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f40\"\nPRETTY_NAME=\"Fedora Linux 40 (Forty)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:40\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f40/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=40\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=40\nSUPPORT_END=2025-05-13\n"} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882568.78587: variable 'ansible_facts' from source: unknown 15621 1726882568.78603: variable 'ansible_facts' from source: unknown 15621 1726882568.78617: variable 'ansible_module_compression' from source: unknown 15621 1726882568.78704: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15621x2kdpbzd/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15621 1726882568.78708: variable 'ansible_facts' from source: unknown 15621 1726882568.78930: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882568.0475202-15644-105596870974071/AnsiballZ_setup.py 15621 1726882568.79167: Sending initial data 15621 1726882568.79170: Sent initial data (154 bytes) 15621 1726882568.79915: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882568.80047: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882568.80074: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882568.80248: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882568.82214: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15621 1726882568.82300: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15621 1726882568.82384: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmpsarc86op /root/.ansible/tmp/ansible-tmp-1726882568.0475202-15644-105596870974071/AnsiballZ_setup.py <<< 15621 1726882568.82393: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882568.0475202-15644-105596870974071/AnsiballZ_setup.py" <<< 15621 1726882568.82581: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmpsarc86op" to remote "/root/.ansible/tmp/ansible-tmp-1726882568.0475202-15644-105596870974071/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882568.0475202-15644-105596870974071/AnsiballZ_setup.py" <<< 15621 1726882568.85304: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882568.85352: stderr chunk (state=3): >>><<< 15621 1726882568.85735: stdout chunk (state=3): >>><<< 15621 1726882568.85739: done transferring module to remote 15621 1726882568.85741: _low_level_execute_command(): starting 15621 1726882568.85744: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882568.0475202-15644-105596870974071/ /root/.ansible/tmp/ansible-tmp-1726882568.0475202-15644-105596870974071/AnsiballZ_setup.py && sleep 0' 15621 1726882568.87136: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882568.87249: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882568.87287: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882568.87687: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 15621 1726882568.90249: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882568.90368: stderr chunk (state=3): >>><<< 15621 1726882568.90372: stdout chunk (state=3): >>><<< 15621 1726882568.90375: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 15621 1726882568.90380: _low_level_execute_command(): starting 15621 1726882568.90382: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882568.0475202-15644-105596870974071/AnsiballZ_setup.py && sleep 0' 15621 1726882568.91303: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882568.91328: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882568.91387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882568.91397: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882568.91413: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882568.91472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882568.91533: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882568.91591: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882568.91616: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882568.91743: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 15621 1726882568.95053: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 15621 1726882568.95133: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 15621 1726882568.95196: stdout chunk (state=3): >>>import 'posix' # <<< 15621 1726882568.95382: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 15621 1726882568.95385: stdout chunk (state=3): >>>import '_codecs' # import 'codecs' # <<< 15621 1726882568.95461: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 15621 1726882568.95553: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87275a4530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8727573b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87275a6ab0> import '_signal' # import '_abc' # <<< 15621 1726882568.95837: stdout chunk (state=3): >>>import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # <<< 15621 1726882568.95840: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 15621 1726882568.95842: stdout chunk (state=3): >>>import 'os' # <<< 15621 1726882568.95853: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 15621 1726882568.95932: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 15621 1726882568.95956: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87273791c0> <<< 15621 1726882568.96011: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 15621 1726882568.96036: stdout chunk (state=3): >>># code object 
from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f872737a000> <<< 15621 1726882568.96076: stdout chunk (state=3): >>>import 'site' # <<< 15621 1726882568.96110: stdout chunk (state=3): >>>Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 15621 1726882568.96782: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 15621 1726882568.96824: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 15621 1726882568.96974: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 15621 1726882568.96992: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87273b7dd0> <<< 15621 1726882568.97015: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 15621 1726882568.97041: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 15621 1726882568.97080: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87273b7fe0> <<< 15621 1726882568.97097: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 15621 1726882568.97141: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 15621 1726882568.97208: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 15621 1726882568.97237: stdout chunk (state=3): >>>import 'itertools' # <<< 15621 1726882568.97279: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py <<< 15621 1726882568.97303: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87273ef800> <<< 15621 1726882568.97435: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87273efe90> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87273cfaa0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87273cd1c0> <<< 
15621 1726882568.97601: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87273b4f80> <<< 15621 1726882568.97652: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 15621 1726882568.97827: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<< 15621 1726882568.97854: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87274137d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87274123f0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87273ce090> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8727410b00> <<< 15621 1726882568.98142: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8727440860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87273b4230> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8727440d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8727440bc0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8727440fb0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87273b2d50> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8727441670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8727441370> import 'importlib.machinery' # <<< 15621 1726882568.98195: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 15621 1726882568.98301: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8727442570> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 15621 1726882568.98330: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f872745c7a0> <<< 15621 1726882568.98354: stdout chunk (state=3): >>>import 'errno' # <<< 15621 1726882568.98408: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 15621 1726882568.98649: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f872745dee0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 15621 1726882568.98663: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f872745ed50> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f872745f3b0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f872745e2d0> <<< 15621 1726882568.98740: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f872745fe00> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f872745f530> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87274425d0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 15621 1726882568.98764: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 15621 1726882568.98790: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 15621 1726882568.98830: stdout chunk 
(state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f872719fce0> <<< 15621 1726882568.98876: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 15621 1726882568.98895: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' <<< 15621 1726882568.99032: stdout chunk (state=3): >>>import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87271c8740> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87271c84a0><<< 15621 1726882568.99044: stdout chunk (state=3): >>> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87271c8680> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87271c88c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f872719de80> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 15621 1726882568.99116: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 15621 1726882568.99146: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 15621 1726882568.99166: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87271c9f70> <<< 15621 1726882568.99337: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87271c8bf0> <<< 15621 1726882568.99340: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8727442750> <<< 15621 1726882568.99343: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 15621 1726882568.99392: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 15621 1726882568.99438: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87271f6300> <<< 15621 1726882568.99545: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 15621 1726882568.99560: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 15621 1726882568.99575: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 15621 1726882568.99794: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f872720e480> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 15621 1726882568.99797: stdout chunk (state=3): >>>import 'ntpath' # <<< 15621 1726882568.99830: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f872724b260> <<< 15621 1726882568.99896: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 15621 1726882568.99909: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 15621 1726882568.99937: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 15621 1726882568.99990: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 15621 1726882569.00241: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8727269a00> <<< 15621 1726882569.00244: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f872724b380> <<< 15621 1726882569.00296: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f872720f110> <<< 15621 1726882569.00338: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87270443b0> <<< 15621 1726882569.00362: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f872720d4c0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87271caea0> <<< 15621 1726882569.00624: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 15621 1726882569.00648: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f8727044650> <<< 15621 1726882569.00935: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_d7ygo3bx/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 15621 1726882569.01330: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc 
matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 15621 1726882569.01392: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 15621 1726882569.01429: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87270ae180> <<< 15621 1726882569.01496: stdout chunk (state=3): >>>import '_typing' # <<< 15621 1726882569.01759: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8727085070> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87270841d0> # zipimport: zlib available <<< 15621 1726882569.01851: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available <<< 15621 1726882569.01888: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 15621 1726882569.04265: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.06339: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' <<< 15621 1726882569.06628: stdout chunk (state=3): >>>import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8727087fe0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87270ddbb0> <<< 15621 1726882569.06649: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87270dd970> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87270dd280> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87270dd9d0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87270aeba0> <<< 15621 1726882569.06727: stdout chunk (state=3): >>>import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from 
'/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87270de900> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87270deb40> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 15621 1726882569.06786: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 15621 1726882569.06801: stdout chunk (state=3): >>>import '_locale' # <<< 15621 1726882569.06866: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87270deff0> <<< 15621 1726882569.06881: stdout chunk (state=3): >>>import 'pwd' # <<< 15621 1726882569.06902: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 15621 1726882569.06925: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 15621 1726882569.07246: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726f40dd0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8726f429f0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726f433b0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726f44590> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 15621 1726882569.07331: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726f46ff0> <<< 15621 1726882569.07378: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8726f47110> <<< 15621 1726882569.07402: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726f452b0> <<< 15621 1726882569.07591: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # 
code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 15621 1726882569.07595: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 15621 1726882569.07608: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726f4aed0> import '_tokenize' # <<< 15621 1726882569.07705: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726f499a0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726f49700> <<< 15621 1726882569.07726: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 15621 1726882569.07745: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 15621 1726882569.07859: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726f4bf50> <<< 15621 1726882569.07892: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726f457c0> <<< 15621 1726882569.07927: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8726f8f080> <<< 15621 1726882569.07964: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726f8f1d0> <<< 15621 1726882569.08028: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 15621 1726882569.08147: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 15621 1726882569.08150: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8726f94dd0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726f94b90> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches 
/usr/lib64/python3.12/uuid.py <<< 15621 1726882569.08241: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 15621 1726882569.08293: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 15621 1726882569.08447: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8726f972c0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726f95430> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 15621 1726882569.08521: stdout chunk (state=3): >>>import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726f9eab0> <<< 15621 1726882569.08695: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726f97440> <<< 15621 1726882569.08794: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8726f9fda0> <<< 15621 1726882569.08841: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8726f9f770> <<< 15621 1726882569.08900: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 15621 1726882569.08927: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8726f9fdd0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726f8f4a0> <<< 15621 1726882569.09133: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 15621 1726882569.09150: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from 
'/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8726fa35c0> <<< 15621 1726882569.09329: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 15621 1726882569.09358: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8726fa4710> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726fa1d30> <<< 15621 1726882569.09421: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8726fa30b0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726fa1940> <<< 15621 1726882569.09444: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 15621 1726882569.09584: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.09740: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # <<< 15621 1726882569.09825: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 15621 1726882569.09991: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.10186: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.11167: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.12185: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # <<< 15621 1726882569.12209: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 15621 1726882569.12432: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8726e289e0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 15621 1726882569.12460: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726e29d90> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726fa7c50> <<< 15621 1726882569.12661: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils._text' # # zipimport: zlib available <<< 15621 1726882569.12842: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.13103: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 15621 1726882569.13129: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726e295b0> <<< 15621 1726882569.13147: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.13983: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.14783: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.14964: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.15015: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 15621 1726882569.15030: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.15088: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.15135: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 15621 1726882569.15138: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.15245: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.15543: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # <<< 15621 1726882569.15556: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.15960: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.16376: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 15621 1726882569.16453: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 15621 1726882569.16488: stdout chunk (state=3): >>>import '_ast' # <<< 15621 1726882569.16564: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726e2a810> <<< 15621 1726882569.16597: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.16733: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.16939: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 15621 1726882569.16983: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 15621 1726882569.17148: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8726e322d0> <<< 15621 1726882569.17302: stdout chunk (state=3): >>># extension module '_blake2' loaded from 
'/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8726e32bd0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8727440bf0> <<< 15621 1726882569.17306: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.17367: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.17373: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 15621 1726882569.17579: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.17582: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 15621 1726882569.17668: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 15621 1726882569.17832: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8726e31880> <<< 15621 1726882569.17898: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726e32cc0> <<< 15621 1726882569.17937: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # <<< 15621 1726882569.18038: stdout chunk (state=3): >>>import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 15621 1726882569.18054: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.18146: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.18183: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.18249: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 15621 1726882569.18437: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 15621 1726882569.18507: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726ec6e10> <<< 15621 1726882569.18581: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726e3cbc0> <<< 15621 1726882569.18705: stdout chunk (state=3): >>>import 'distro.distro' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f8726e36d50> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726e36bd0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 15621 1726882569.18723: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.18820: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 15621 1726882569.18869: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 15621 1726882569.18903: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # <<< 15621 1726882569.19001: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.19026: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.19094: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.19326: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 15621 1726882569.19376: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 15621 1726882569.19405: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.19498: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.19621: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.19654: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.19705: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 15621 1726882569.19714: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.20010: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.20314: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.20374: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.20458: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 15621 1726882569.20492: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 15621 1726882569.20519: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 15621 1726882569.20545: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 15621 1726882569.20572: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726ecdc70> <<< 15621 1726882569.20599: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 15621 1726882569.20625: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 15621 1726882569.20637: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 15621 1726882569.20872: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f872690c440> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f872690c7a0> <<< 15621 1726882569.20875: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726ead490> <<< 15621 1726882569.20896: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726eac710> <<< 15621 1726882569.20935: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726ecc380> <<< 15621 1726882569.20939: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726ecffb0> <<< 15621 1726882569.20974: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 15621 1726882569.21041: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 15621 1726882569.21079: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 15621 1726882569.21121: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 15621 1726882569.21153: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f872690f770> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f872690f020> <<< 15621 1726882569.21198: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' <<< 15621 1726882569.21201: stdout chunk (state=3): >>># extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f872690f200> <<< 15621 1726882569.21538: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f872690e450> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 
'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f872690f890> <<< 15621 1726882569.21556: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87269763c0> <<< 15621 1726882569.21735: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87269743e0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726ecd430> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 15621 1726882569.21828: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.21852: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.21935: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 15621 1726882569.22041: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.22044: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.22100: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 15621 1726882569.22120: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.22189: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available <<< 15621 1726882569.22250: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # <<< 15621 1726882569.22297: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.22426: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # <<< 15621 1726882569.22429: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.22515: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.22557: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 15621 1726882569.22561: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.22708: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.22989: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 15621 1726882569.23024: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.23850: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.24681: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 15621 1726882569.24685: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.24757: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.24847: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 
1726882569.24901: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.25054: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available <<< 15621 1726882569.25058: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 15621 1726882569.25077: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.25186: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.25358: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available <<< 15621 1726882569.25387: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available <<< 15621 1726882569.25626: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # <<< 15621 1726882569.25650: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15621 1726882569.25733: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 15621 1726882569.25756: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 15621 1726882569.25786: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87269760f0> <<< 15621 1726882569.25796: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 15621 1726882569.25839: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 15621 1726882569.26302: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87269771a0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 15621 1726882569.26344: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available <<< 15621 1726882569.26521: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 15621 1726882569.26526: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.26542: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.26589: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 15621 1726882569.26606: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.26651: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.26691: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 15621 1726882569.26749: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 15621 1726882569.26814: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 15621 1726882569.26885: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87269a6630> <<< 15621 1726882569.27140: stdout chunk (state=3): >>>import 
'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726993500> import 'ansible.module_utils.facts.system.python' # <<< 15621 1726882569.27166: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.27196: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.27307: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 15621 1726882569.27316: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.27333: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.27432: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.27547: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.27702: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # <<< 15621 1726882569.27749: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available <<< 15621 1726882569.27868: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available <<< 15621 1726882569.27907: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 15621 1726882569.27951: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 15621 1726882569.28061: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87269c2120> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726993620> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available <<< 15621 1726882569.28079: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.28192: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 15621 1726882569.28195: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.28373: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.28477: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 15621 1726882569.28483: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.28772: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15621 1726882569.28849: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.28891: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # <<< 15621 1726882569.28937: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15621 1726882569.28976: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.29335: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.29499: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 15621 1726882569.29713: stdout chunk (state=3): >>># zipimport: 
zlib available <<< 15621 1726882569.29936: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 15621 1726882569.29972: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.30009: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.30043: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.31175: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.31669: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 15621 1726882569.31673: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.31830: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.31959: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available <<< 15621 1726882569.32238: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.32577: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available <<< 15621 1726882569.32856: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available <<< 15621 1726882569.32860: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.33067: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.33395: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available <<< 15621 1726882569.33422: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.33447: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 15621 1726882569.33729: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available <<< 15621 1726882569.33769: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available <<< 15621 1726882569.33797: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 15621 1726882569.33800: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.33866: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.33938: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available <<< 15621 1726882569.34231: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.37075: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available <<< 15621 1726882569.37349: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available <<< 15621 1726882569.37500: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 15621 1726882569.37540: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882569.38058: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 15621 1726882569.38082: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 15621 1726882569.38112: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' <<< 15621 1726882569.38166: stdout chunk (state=3): >>># extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f872629faa0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f872629e690> <<< 15621 1726882569.38190: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f872629c4d0> <<< 15621 1726882571.10153: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87262e5130> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 
'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87262e6030> <<< 15621 1726882571.10366: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' <<< 15621 1726882571.10371: stdout chunk (state=3): >>>import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f872699c800> <<< 15621 1726882571.10388: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f872699fcb0> <<< 15621 1726882571.10716: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 15621 1726882571.31368: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_lsb": {}, "ansible_loadavg": {"1m": 0.92333984375, "5m": 0.66650390625, "15m": 0.322265625}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCqxUjCEqNblrTyW6Uf6mIYxca8N+p8oJNuOoXU65bGRNg3CMa5WjaOdqcLJoa5cqHU94Eb2GKTTyez0hcVUk7tsi3NxQudVrBDJQwbGPLKwHTfAOeffQrSKU6cQIc1wl+jLeNyQet7t+mRPHDLLjdsLuWud7KDSFY7tB05hqCIT7br7Ql/dFhcnCdWQFQMOHFOz3ScJe9gey/LD3ji7GRONjSr/t5cpKmB6mxzEmsb1n6YZdbP8HCphGcvKR4W+uaX3gVfQE0qvrqlobTyex8yIrkML2bRGO0cQ0YQWRYUwl+2NZufO8pixR1WlzvjooEQLCa77cJ6SZ8LyFkDOI+mMyuj8kcM9hS4AD91rPxl8C0d6Jg8RKqnImxC3X/NNaRYHqlewUGo6VKkcO4+lxgJGqYFcmkGEHzq4fuf6gtrr3rJkcIFcrluI0mSyZ2wXzI9K1OLHK0fnDvDUdV21RdTxfpz2ZFqykIWxdtugE4qaNMgbtV0VnufdkfZoCt9ayU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCVkskQ7Wf194qJaR5aLJzIbxDeKLsVL0wQFKV8r0F7GGZAGvI7/LHajoQ1NvRR35h4P+UpQQWPriVBtLfXYfXQ=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHICQbhAvKstSrwCX3R+nlPjOjLF0EHt/gL32n1ZS9Xl", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "ip-10-31-45-226.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-226", "ansible_nodename": 
"ip-10-31-45-226.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ea14692d25e88f0b7167787b368d", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fibre_channel_wwn": [], "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_iscsi_iqn": "", "ansible_is_chroot": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", "second": "10", "epoch": "1726882570", "epoch_int": "1726882570", "date": "2024-09-20", "time": "21:36:10", "iso8601_micro": "2024-09-21T01:36:10.907938Z", "iso8601": "2024-09-21T01:36:10Z", "iso8601_basic": "20240920T213610907938", "iso8601_basic_short": "20240920T213610", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:b5954bb9-e972-4b2a-94f1-a82c77e96f77", "ansible_pkg_mgr": "dnf", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.180 36814 10.31.45.226 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 36814 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_fips": false, "ansible_local": {}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "02:19:da:ea:a3:f3", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.226", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::19:daff:feea:a3f3", "prefix": "64", 
"scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.226", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:19:da:ea:a3:f3", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.226"], "ansible_all_ipv6_addresses": ["fe80::19:daff:feea:a3f3"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.226", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::19:daff:feea:a3f3"]}, "ansible_service_mgr": "systemd", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3068, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 648, "free": 3068}, "nocache": {"free": 3451, "used": 265}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ea14-692d-25e8-8f0b-7167787b368d", "ansible_product_uuid": "ec22ea14-692d-25e8-8f0b-7167787b368d", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 715, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, 
"ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251384340480, "block_size": 4096, "block_total": 64483404, "block_available": 61373130, "block_used": 3110274, "inode_total": 16384000, "inode_available": 16303142, "inode_used": 80858, "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"}], "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15621 1726882571.31971: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 15621 1726882571.32003: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type<<< 15621 1726882571.32026: stdout chunk (state=3): >>> # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr<<< 15621 1726882571.32061: stdout chunk (state=3): >>> # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external<<< 15621 1726882571.32115: stdout chunk (state=3): >>> # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os <<< 15621 1726882571.32145: stdout chunk (state=3): >>># cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib<<< 15621 1726882571.32184: stdout chunk (state=3): >>> # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib<<< 15621 1726882571.32217: stdout chunk (state=3): >>> # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] 
removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma<<< 15621 1726882571.32290: stdout chunk (state=3): >>> # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2<<< 15621 1726882571.32294: stdout chunk (state=3): >>> # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath<<< 15621 1726882571.32358: stdout chunk (state=3): >>> # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils<<< 15621 1726882571.32396: stdout chunk (state=3): >>> # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string<<< 15621 1726882571.32658: stdout chunk (state=3): >>> # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # 
cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing 
ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] 
removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb <<< 15621 1726882571.32711: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux<<< 15621 1726882571.32748: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux<<< 15621 1726882571.32900: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.network.iscsi # destroy 
ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep<<< 15621 1726882571.33093: stdout chunk (state=3): >>> # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 15621 1726882571.33283: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 15621 1726882571.33298: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 15621 1726882571.33459: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 15621 1726882571.33489: stdout chunk (state=3): >>># destroy zipfile<<< 15621 1726882571.33502: stdout chunk (state=3): >>> # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress<<< 15621 1726882571.33554: stdout chunk (state=3): >>> # destroy ntpath <<< 15621 1726882571.33589: stdout chunk (state=3): >>># destroy importlib <<< 15621 1726882571.33617: stdout chunk (state=3): >>># destroy zipimport <<< 15621 1726882571.33632: stdout chunk (state=3): >>># destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder<<< 15621 1726882571.33727: stdout chunk (state=3): >>> # destroy json.scanner # destroy _json # destroy grp # destroy encodings<<< 15621 1726882571.33774: stdout chunk (state=3): >>> # destroy _locale # destroy locale # destroy select # destroy _signal<<< 15621 1726882571.33777: stdout chunk (state=3): >>> # destroy _posixsubprocess # destroy syslog # destroy uuid<<< 15621 1726882571.33863: stdout chunk (state=3): >>> <<< 15621 1726882571.34000: stdout chunk (state=3): >>># destroy _hashlib # destroy _blake2 # destroy selinux<<< 15621 1726882571.34006: stdout chunk (state=3): >>> # destroy shutil # destroy distro # destroy distro.distro<<< 15621 1726882571.34046: stdout chunk (state=3): >>> # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector <<< 15621 1726882571.34069: stdout chunk (state=3): >>># destroy multiprocessing # destroy multiprocessing.queues <<< 15621 1726882571.34088: stdout chunk (state=3): >>># destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle<<< 15621 1726882571.34132: stdout chunk (state=3): >>> # destroy _compat_pickle # destroy _pickle <<< 15621 1726882571.34197: stdout chunk (state=3): >>># destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction <<< 15621 1726882571.34220: stdout chunk 
(state=3): >>># destroy selectors # destroy shlex <<< 15621 1726882571.34307: stdout chunk (state=3): >>># destroy fcntl # destroy datetime <<< 15621 1726882571.34390: stdout chunk (state=3): >>># destroy subprocess # destroy base64 # destroy _ssl <<< 15621 1726882571.34410: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json <<< 15621 1726882571.34444: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob<<< 15621 1726882571.34582: stdout chunk (state=3): >>> # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile<<< 15621 1726882571.34590: stdout chunk (state=3): >>> # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array<<< 15621 1726882571.34637: stdout chunk (state=3): >>> # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux<<< 15621 1726882571.34659: stdout chunk (state=3): >>> # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc<<< 15621 1726882571.34737: stdout chunk (state=3): >>> # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128<<< 15621 1726882571.34748: stdout chunk (state=3): >>> # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid <<< 15621 1726882571.34933: stdout chunk (state=3): >>># cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437<<< 15621 1726882571.34953: stdout chunk (state=3): >>> # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat <<< 15621 1726882571.34981: stdout chunk (state=3): >>># cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # 
cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix<<< 15621 1726882571.35009: stdout chunk (state=3): >>> # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref<<< 15621 1726882571.35045: stdout chunk (state=3): >>> # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys<<< 15621 1726882571.35068: stdout chunk (state=3): >>> # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon <<< 15621 1726882571.35228: stdout chunk (state=3): >>># destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 15621 1726882571.35308: stdout chunk (state=3): >>># destroy sys.monitoring <<< 15621 1726882571.35333: stdout chunk (state=3): >>># destroy _socket <<< 15621 1726882571.35507: stdout chunk (state=3): >>># destroy _collections # destroy platform <<< 15621 1726882571.35552: stdout chunk (state=3): >>># destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 15621 1726882571.35601: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize<<< 15621 1726882571.35754: stdout chunk (state=3): >>> # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request <<< 15621 1726882571.35757: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules <<< 15621 1726882571.35841: stdout chunk (state=3): >>># destroy _frozen_importlib <<< 15621 1726882571.35934: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 <<< 15621 1726882571.35953: stdout chunk (state=3): >>># destroy encodings.utf_8_sig # destroy encodings.cp437 <<< 15621 1726882571.36262: stdout chunk (state=3): >>># destroy encodings.idna # destroy _codecs # destroy io # destroy traceback<<< 15621 1726882571.36280: stdout chunk (state=3): >>> # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 15621 1726882571.36790: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882571.36906: stderr chunk (state=3): >>>Shared connection to 10.31.45.226 closed. 
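
The JSON payload captured above is the full ansible_facts result from the ansible.legacy.setup module; its invocation block shows the arguments that were in effect (gather_subset: ["all"], gather_timeout: 10, empty filter, fact_path: /etc/ansible/facts.d). As an illustrative sketch only, not part of the captured run and using a placeholder host pattern, a play that gathers the same fact set and consumes a few of the facts visible in that payload could look like:

- hosts: all                      # placeholder pattern, not taken from this log
  gather_facts: false
  tasks:
    - name: Gather the same fact set shown in the log (all subsets, 10s timeout)
      ansible.builtin.setup:
        gather_subset:
          - all
        gather_timeout: 10

    - name: Reference facts that appear in the payload above
      ansible.builtin.debug:
        msg: >-
          {{ ansible_distribution }} {{ ansible_distribution_major_version }}
          reachable at {{ ansible_default_ipv4.address }}
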
<<< 15621 1726882571.36909: stdout chunk (state=3): >>><<< 15621 1726882571.36929: stderr chunk (state=3): >>><<< 15621 1726882571.37190: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87275a4530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8727573b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87275a6ab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87273791c0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f872737a000> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
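
The per-import lines in the returned stdout ("import 'x' #", "# code object from ...", "# zipimport: zlib available") are CPython's verbose import tracing. The ansible_env facts earlier in this payload include PYTHONVERBOSE: "1", which is consistent with verbose tracing being enabled for the whole remote session; the tracing output is interleaved with the module's JSON result in the captured output, which is why it surrounds the ansible_facts payload in this log. As a hedged sketch only (this run may instead have exported the variable at the shell or connection level), the same tracing could be requested for a single task with the task-level environment keyword:

- name: Gather facts with verbose Python import tracing (illustrative)
  ansible.builtin.setup:
  environment:
    PYTHONVERBOSE: "1"            # standard CPython variable, equivalent to python -v
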
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87273b7dd0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87273b7fe0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87273ef800> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87273efe90> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87273cfaa0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87273cd1c0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87273b4f80> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87274137d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87274123f0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87273ce090> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8727410b00> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8727440860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87273b4230> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8727440d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8727440bc0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8727440fb0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87273b2d50> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8727441670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8727441370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8727442570> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f872745c7a0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f872745dee0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f872745ed50> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f872745f3b0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f872745e2d0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f872745fe00> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f872745f530> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87274425d0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f872719fce0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87271c8740> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87271c84a0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87271c8680> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87271c88c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f872719de80> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87271c9f70> 
import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87271c8bf0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8727442750> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87271f6300> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f872720e480> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f872724b260> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8727269a00> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f872724b380> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f872720f110> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87270443b0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f872720d4c0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87271caea0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f8727044650> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_d7ygo3bx/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # 
/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87270ae180> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8727085070> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87270841d0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8727087fe0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87270ddbb0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87270dd970> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87270dd280> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87270dd9d0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87270aeba0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87270de900> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87270deb40> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87270deff0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from 
'/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726f40dd0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8726f429f0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726f433b0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726f44590> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726f46ff0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8726f47110> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726f452b0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726f4aed0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726f499a0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726f49700> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726f4bf50> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726f457c0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8726f8f080> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726f8f1d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8726f94dd0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726f94b90> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8726f972c0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726f95430> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726f9eab0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726f97440> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8726f9fda0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8726f9f770> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 
'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8726f9fdd0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726f8f4a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8726fa35c0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8726fa4710> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726fa1d30> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8726fa30b0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726fa1940> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8726e289e0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726e29d90> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726fa7c50> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available 
# zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726e295b0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726e2a810> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8726e322d0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8726e32bd0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8727440bf0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8726e31880> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726e32cc0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726ec6e10> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726e3cbc0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726e36d50> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726e36bd0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726ecdc70> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f872690c440> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f872690c7a0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726ead490> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726eac710> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726ecc380> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726ecffb0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f872690f770> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f872690f020> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f872690f200> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f872690e450> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f872690f890> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87269763c0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87269743e0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726ecd430> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib 
available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87269760f0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87269771a0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87269a6630> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726993500> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 
'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f87269c2120> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8726993620> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f872629faa0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f872629e690> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f872629c4d0> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87262e5130> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f87262e6030> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f872699c800> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f872699fcb0> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_lsb": {}, "ansible_loadavg": {"1m": 0.92333984375, "5m": 0.66650390625, "15m": 0.322265625}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCqxUjCEqNblrTyW6Uf6mIYxca8N+p8oJNuOoXU65bGRNg3CMa5WjaOdqcLJoa5cqHU94Eb2GKTTyez0hcVUk7tsi3NxQudVrBDJQwbGPLKwHTfAOeffQrSKU6cQIc1wl+jLeNyQet7t+mRPHDLLjdsLuWud7KDSFY7tB05hqCIT7br7Ql/dFhcnCdWQFQMOHFOz3ScJe9gey/LD3ji7GRONjSr/t5cpKmB6mxzEmsb1n6YZdbP8HCphGcvKR4W+uaX3gVfQE0qvrqlobTyex8yIrkML2bRGO0cQ0YQWRYUwl+2NZufO8pixR1WlzvjooEQLCa77cJ6SZ8LyFkDOI+mMyuj8kcM9hS4AD91rPxl8C0d6Jg8RKqnImxC3X/NNaRYHqlewUGo6VKkcO4+lxgJGqYFcmkGEHzq4fuf6gtrr3rJkcIFcrluI0mSyZ2wXzI9K1OLHK0fnDvDUdV21RdTxfpz2ZFqykIWxdtugE4qaNMgbtV0VnufdkfZoCt9ayU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCVkskQ7Wf194qJaR5aLJzIbxDeKLsVL0wQFKV8r0F7GGZAGvI7/LHajoQ1NvRR35h4P+UpQQWPriVBtLfXYfXQ=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHICQbhAvKstSrwCX3R+nlPjOjLF0EHt/gL32n1ZS9Xl", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "ip-10-31-45-226.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-226", "ansible_nodename": "ip-10-31-45-226.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ea14692d25e88f0b7167787b368d", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": 
{"status": "disabled"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fibre_channel_wwn": [], "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_iscsi_iqn": "", "ansible_is_chroot": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", "second": "10", "epoch": "1726882570", "epoch_int": "1726882570", "date": "2024-09-20", "time": "21:36:10", "iso8601_micro": "2024-09-21T01:36:10.907938Z", "iso8601": "2024-09-21T01:36:10Z", "iso8601_basic": "20240920T213610907938", "iso8601_basic_short": "20240920T213610", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:b5954bb9-e972-4b2a-94f1-a82c77e96f77", "ansible_pkg_mgr": "dnf", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.180 36814 10.31.45.226 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 36814 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_fips": false, "ansible_local": {}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "02:19:da:ea:a3:f3", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.226", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::19:daff:feea:a3f3", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.226", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": 
"02:19:da:ea:a3:f3", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.226"], "ansible_all_ipv6_addresses": ["fe80::19:daff:feea:a3f3"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.226", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::19:daff:feea:a3f3"]}, "ansible_service_mgr": "systemd", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3068, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 648, "free": 3068}, "nocache": {"free": 3451, "used": 265}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ea14-692d-25e8-8f0b-7167787b368d", "ansible_product_uuid": "ec22ea14-692d-25e8-8f0b-7167787b368d", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 715, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251384340480, "block_size": 4096, "block_total": 64483404, "block_available": 61373130, "block_used": 3110274, "inode_total": 16384000, "inode_available": 16303142, "inode_used": 80858, "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"}], "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": 
["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] 
removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy 
ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing 
ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # 
destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy 
zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # 
cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 
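The debug1/debug2 lines above are OpenSSH connection multiplexing at work: rather than opening a new TCP connection for every module invocation, Ansible reuses an existing master socket under /root/.ansible/cp/. A minimal sketch of how that behaviour can be configured per host in an inventory follows; the ControlPath value and the pairing of managed_node3 with 10.31.45.226 are illustrative assumptions based on this output, not taken verbatim from the playbook's actual inventory:

all:
  hosts:
    managed_node3:
      ansible_host: 10.31.45.226        # address seen in the SSH debug output above
      # Reuse one SSH connection across module invocations; these options
      # mirror ansible-core's default ssh_args and control path directory.
      ansible_ssh_common_args: "-o ControlMaster=auto -o ControlPersist=60s -o ControlPath=~/.ansible/cp/%C"

With multiplexing in place, the 'auto-mux: Trying existing master' and 'mux_client_request_session' messages indicate that the follow-up commands (module upload, execution, temp-dir cleanup) all ride on the same SSH session.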
[WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # 
[WARNING]: Platform linux on host managed_node3 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information.
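The interpreter-discovery warning above means Ansible chose /usr/bin/python3.12 on managed_node3 automatically. Pinning the interpreter makes that choice explicit and suppresses the warning; a minimal sketch (the file name and placement are assumptions for illustration, the variable itself is standard Ansible behaviour):

# host_vars/managed_node3.yml
# Pin the interpreter so discovery does not have to guess on each run.
ansible_python_interpreter: /usr/bin/python3.12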
15621 1726882571.41476: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882568.0475202-15644-105596870974071/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15621 1726882571.41480: _low_level_execute_command(): starting 15621 1726882571.41483: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882568.0475202-15644-105596870974071/ > /dev/null 2>&1 && sleep 0' 15621 1726882571.42415: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882571.42430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882571.42493: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882571.42558: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882571.42656: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 15621 1726882571.45459: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882571.45520: stderr chunk (state=3): >>><<< 15621 1726882571.45546: stdout chunk (state=3): >>><<< 15621 1726882571.45650: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 15621 1726882571.45664: handler run complete 15621 1726882571.46005: variable 'ansible_facts' from source: unknown 15621 1726882571.46172: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882571.46535: variable 'ansible_facts' from source: unknown 15621 1726882571.46631: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882571.46778: attempt loop complete, returning result 15621 1726882571.46788: _execute() done 15621 1726882571.46796: dumping result to json 15621 1726882571.46827: done dumping result, returning 15621 1726882571.46842: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [0affc7ec-ae25-af1a-5b92-00000000007c] 15621 1726882571.46852: sending task result for task 0affc7ec-ae25-af1a-5b92-00000000007c 15621 1726882571.47391: done sending task result for task 0affc7ec-ae25-af1a-5b92-00000000007c 15621 1726882571.47394: WORKER PROCESS EXITING ok: [managed_node3] 15621 1726882571.47987: no more pending results, returning what we have 15621 1726882571.47991: results queue empty 15621 1726882571.47991: checking for any_errors_fatal 15621 1726882571.47993: done checking for any_errors_fatal 15621 1726882571.47994: checking for max_fail_percentage 15621 1726882571.47995: done checking for max_fail_percentage 15621 1726882571.47996: checking to see if all hosts have failed and the running result is not ok 15621 1726882571.47997: done checking to see if all hosts have failed 15621 1726882571.47998: getting the remaining hosts for this loop 15621 1726882571.48000: done getting the remaining hosts for this loop 15621 1726882571.48004: getting the next task for host managed_node3 15621 1726882571.48010: done getting next task for host managed_node3 15621 1726882571.48012: ^ task is: TASK: meta (flush_handlers) 15621 1726882571.48014: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882571.48017: getting variables 15621 1726882571.48019: in VariableManager get_vars() 15621 1726882571.48043: Calling all_inventory to load vars for managed_node3 15621 1726882571.48046: Calling groups_inventory to load vars for managed_node3 15621 1726882571.48049: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882571.48060: Calling all_plugins_play to load vars for managed_node3 15621 1726882571.48063: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882571.48066: Calling groups_plugins_play to load vars for managed_node3 15621 1726882571.48256: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882571.48475: done with get_vars() 15621 1726882571.48486: done getting variables 15621 1726882571.48583: in VariableManager get_vars() 15621 1726882571.48592: Calling all_inventory to load vars for managed_node3 15621 1726882571.48595: Calling groups_inventory to load vars for managed_node3 15621 1726882571.48597: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882571.48601: Calling all_plugins_play to load vars for managed_node3 15621 1726882571.48604: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882571.48607: Calling groups_plugins_play to load vars for managed_node3 15621 1726882571.48775: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882571.48989: done with get_vars() 15621 1726882571.49002: done queuing things up, now waiting for results queue to drain 15621 1726882571.49004: results queue empty 15621 1726882571.49005: checking for any_errors_fatal 15621 1726882571.49007: done checking for any_errors_fatal 15621 1726882571.49008: checking for max_fail_percentage 15621 1726882571.49009: done checking for max_fail_percentage 15621 1726882571.49009: checking to see if all hosts have failed and the running result is not ok 15621 1726882571.49014: done checking to see if all hosts have failed 15621 1726882571.49015: getting the remaining hosts for this loop 15621 1726882571.49016: done getting the remaining hosts for this loop 15621 1726882571.49018: getting the next task for host managed_node3 15621 1726882571.49024: done getting next task for host managed_node3 15621 1726882571.49027: ^ task is: TASK: Include the task 'el_repo_setup.yml' 15621 1726882571.49028: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882571.49030: getting variables 15621 1726882571.49031: in VariableManager get_vars() 15621 1726882571.49039: Calling all_inventory to load vars for managed_node3 15621 1726882571.49041: Calling groups_inventory to load vars for managed_node3 15621 1726882571.49043: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882571.49048: Calling all_plugins_play to load vars for managed_node3 15621 1726882571.49051: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882571.49054: Calling groups_plugins_play to load vars for managed_node3 15621 1726882571.49201: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882571.49414: done with get_vars() 15621 1726882571.49423: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml:11 Friday 20 September 2024 21:36:11 -0400 (0:00:03.529) 0:00:03.574 ****** 15621 1726882571.49496: entering _queue_task() for managed_node3/include_tasks 15621 1726882571.49502: Creating lock for include_tasks 15621 1726882571.49946: worker is 1 (out of 1 available) 15621 1726882571.49960: exiting _queue_task() for managed_node3/include_tasks 15621 1726882571.49976: done queuing things up, now waiting for results queue to drain 15621 1726882571.49978: waiting for pending results... 15621 1726882571.50220: running TaskExecutor() for managed_node3/TASK: Include the task 'el_repo_setup.yml' 15621 1726882571.50331: in run() - task 0affc7ec-ae25-af1a-5b92-000000000006 15621 1726882571.50355: variable 'ansible_search_path' from source: unknown 15621 1726882571.50398: calling self._execute() 15621 1726882571.50486: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882571.50500: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882571.50513: variable 'omit' from source: magic vars 15621 1726882571.50633: _execute() done 15621 1726882571.50643: dumping result to json 15621 1726882571.50652: done dumping result, returning 15621 1726882571.50668: done running TaskExecutor() for managed_node3/TASK: Include the task 'el_repo_setup.yml' [0affc7ec-ae25-af1a-5b92-000000000006] 15621 1726882571.50683: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000006 15621 1726882571.50938: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000006 15621 1726882571.50941: WORKER PROCESS EXITING 15621 1726882571.51012: no more pending results, returning what we have 15621 1726882571.51017: in VariableManager get_vars() 15621 1726882571.51053: Calling all_inventory to load vars for managed_node3 15621 1726882571.51056: Calling groups_inventory to load vars for managed_node3 15621 1726882571.51060: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882571.51080: Calling all_plugins_play to load vars for managed_node3 15621 1726882571.51084: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882571.51087: Calling groups_plugins_play to load vars for managed_node3 15621 1726882571.51442: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882571.51632: done with get_vars() 15621 1726882571.51639: variable 'ansible_search_path' from source: unknown 15621 1726882571.51652: we have included files to process 15621 1726882571.51653: 
generating all_blocks data 15621 1726882571.51654: done generating all_blocks data 15621 1726882571.51655: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 15621 1726882571.51657: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 15621 1726882571.51659: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 15621 1726882571.52361: in VariableManager get_vars() 15621 1726882571.52380: done with get_vars() 15621 1726882571.52393: done processing included file 15621 1726882571.52395: iterating over new_blocks loaded from include file 15621 1726882571.52397: in VariableManager get_vars() 15621 1726882571.52407: done with get_vars() 15621 1726882571.52409: filtering new block on tags 15621 1726882571.52426: done filtering new block on tags 15621 1726882571.52430: in VariableManager get_vars() 15621 1726882571.52440: done with get_vars() 15621 1726882571.52442: filtering new block on tags 15621 1726882571.52459: done filtering new block on tags 15621 1726882571.52483: in VariableManager get_vars() 15621 1726882571.52495: done with get_vars() 15621 1726882571.52497: filtering new block on tags 15621 1726882571.52511: done filtering new block on tags 15621 1726882571.52513: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node3 15621 1726882571.52518: extending task lists for all hosts with included blocks 15621 1726882571.52575: done extending task lists 15621 1726882571.52577: done processing included files 15621 1726882571.52578: results queue empty 15621 1726882571.52578: checking for any_errors_fatal 15621 1726882571.52580: done checking for any_errors_fatal 15621 1726882571.52580: checking for max_fail_percentage 15621 1726882571.52581: done checking for max_fail_percentage 15621 1726882571.52582: checking to see if all hosts have failed and the running result is not ok 15621 1726882571.52583: done checking to see if all hosts have failed 15621 1726882571.52584: getting the remaining hosts for this loop 15621 1726882571.52585: done getting the remaining hosts for this loop 15621 1726882571.52587: getting the next task for host managed_node3 15621 1726882571.52591: done getting next task for host managed_node3 15621 1726882571.52593: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 15621 1726882571.52596: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882571.52598: getting variables 15621 1726882571.52599: in VariableManager get_vars() 15621 1726882571.52607: Calling all_inventory to load vars for managed_node3 15621 1726882571.52609: Calling groups_inventory to load vars for managed_node3 15621 1726882571.52611: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882571.52616: Calling all_plugins_play to load vars for managed_node3 15621 1726882571.52619: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882571.52624: Calling groups_plugins_play to load vars for managed_node3 15621 1726882571.52761: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882571.53101: done with get_vars() 15621 1726882571.53109: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Friday 20 September 2024 21:36:11 -0400 (0:00:00.036) 0:00:03.611 ****** 15621 1726882571.53182: entering _queue_task() for managed_node3/setup 15621 1726882571.53959: worker is 1 (out of 1 available) 15621 1726882571.53976: exiting _queue_task() for managed_node3/setup 15621 1726882571.53988: done queuing things up, now waiting for results queue to drain 15621 1726882571.53989: waiting for pending results... 15621 1726882571.54457: running TaskExecutor() for managed_node3/TASK: Gather the minimum subset of ansible_facts required by the network role test 15621 1726882571.54527: in run() - task 0affc7ec-ae25-af1a-5b92-00000000008d 15621 1726882571.54550: variable 'ansible_search_path' from source: unknown 15621 1726882571.54629: variable 'ansible_search_path' from source: unknown 15621 1726882571.54633: calling self._execute() 15621 1726882571.54699: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882571.54712: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882571.54729: variable 'omit' from source: magic vars 15621 1726882571.55317: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15621 1726882571.57877: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15621 1726882571.57967: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15621 1726882571.58064: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15621 1726882571.58129: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15621 1726882571.58144: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15621 1726882571.58388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882571.58630: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882571.58634: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 15621 1726882571.58636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882571.58639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882571.58735: variable 'ansible_facts' from source: unknown 15621 1726882571.58811: variable 'network_test_required_facts' from source: task vars 15621 1726882571.58859: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 15621 1726882571.58873: variable 'omit' from source: magic vars 15621 1726882571.58919: variable 'omit' from source: magic vars 15621 1726882571.58966: variable 'omit' from source: magic vars 15621 1726882571.59028: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882571.59034: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882571.59055: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882571.59085: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882571.59101: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882571.59139: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882571.59177: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882571.59180: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882571.59268: Set connection var ansible_connection to ssh 15621 1726882571.59291: Set connection var ansible_shell_executable to /bin/sh 15621 1726882571.59327: Set connection var ansible_timeout to 10 15621 1726882571.59331: Set connection var ansible_shell_type to sh 15621 1726882571.59334: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882571.59336: Set connection var ansible_pipelining to False 15621 1726882571.59359: variable 'ansible_shell_executable' from source: unknown 15621 1726882571.59367: variable 'ansible_connection' from source: unknown 15621 1726882571.59392: variable 'ansible_module_compression' from source: unknown 15621 1726882571.59395: variable 'ansible_shell_type' from source: unknown 15621 1726882571.59398: variable 'ansible_shell_executable' from source: unknown 15621 1726882571.59400: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882571.59428: variable 'ansible_pipelining' from source: unknown 15621 1726882571.59432: variable 'ansible_timeout' from source: unknown 15621 1726882571.59434: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882571.59699: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15621 1726882571.59719: variable 'omit' from source: magic vars 15621 1726882571.59786: starting attempt loop 15621 
1726882571.59789: running the handler 15621 1726882571.59792: _low_level_execute_command(): starting 15621 1726882571.59794: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15621 1726882571.60598: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882571.60640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882571.60656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882571.60707: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882571.60756: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882571.60773: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882571.60811: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882571.60936: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 15621 1726882571.63395: stdout chunk (state=3): >>>/root <<< 15621 1726882571.63617: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882571.63620: stdout chunk (state=3): >>><<< 15621 1726882571.63624: stderr chunk (state=3): >>><<< 15621 1726882571.63645: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 15621 1726882571.63656: _low_level_execute_command(): starting 15621 1726882571.63664: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726882571.6364439-15765-200820459952773 `" && echo ansible-tmp-1726882571.6364439-15765-200820459952773="` echo /root/.ansible/tmp/ansible-tmp-1726882571.6364439-15765-200820459952773 `" ) && sleep 0' 15621 1726882571.64120: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882571.64126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found <<< 15621 1726882571.64129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address <<< 15621 1726882571.64131: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882571.64133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882571.64210: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882571.64214: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882571.64290: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 15621 1726882571.66994: stdout chunk (state=3): >>>ansible-tmp-1726882571.6364439-15765-200820459952773=/root/.ansible/tmp/ansible-tmp-1726882571.6364439-15765-200820459952773 <<< 15621 1726882571.67146: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882571.67201: stderr chunk (state=3): >>><<< 15621 1726882571.67218: stdout chunk (state=3): >>><<< 15621 1726882571.67247: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882571.6364439-15765-200820459952773=/root/.ansible/tmp/ansible-tmp-1726882571.6364439-15765-200820459952773 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status 
from master 0 15621 1726882571.67316: variable 'ansible_module_compression' from source: unknown 15621 1726882571.67360: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15621x2kdpbzd/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15621 1726882571.67431: variable 'ansible_facts' from source: unknown 15621 1726882571.67582: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882571.6364439-15765-200820459952773/AnsiballZ_setup.py 15621 1726882571.67748: Sending initial data 15621 1726882571.67752: Sent initial data (154 bytes) 15621 1726882571.68336: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882571.68443: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882571.68465: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882571.68491: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882571.68507: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882571.68632: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 15621 1726882571.70508: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15621 1726882571.70619: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15621 1726882571.70705: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmpsgj33ydj /root/.ansible/tmp/ansible-tmp-1726882571.6364439-15765-200820459952773/AnsiballZ_setup.py <<< 15621 1726882571.70754: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882571.6364439-15765-200820459952773/AnsiballZ_setup.py" <<< 15621 1726882571.70840: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmpsgj33ydj" to remote "/root/.ansible/tmp/ansible-tmp-1726882571.6364439-15765-200820459952773/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882571.6364439-15765-200820459952773/AnsiballZ_setup.py" <<< 15621 1726882571.73384: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882571.73425: stderr chunk (state=3): >>><<< 15621 1726882571.73440: stdout chunk (state=3): >>><<< 15621 1726882571.73562: done transferring module to remote 15621 1726882571.73565: _low_level_execute_command(): starting 15621 1726882571.73568: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882571.6364439-15765-200820459952773/ /root/.ansible/tmp/ansible-tmp-1726882571.6364439-15765-200820459952773/AnsiballZ_setup.py && sleep 0' 15621 1726882571.73987: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882571.73991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882571.73994: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882571.74004: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882571.74056: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882571.74074: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882571.74163: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 15621 1726882571.76488: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882571.76494: stdout chunk (state=3): >>><<< 15621 1726882571.76497: stderr chunk (state=3): >>><<< 15621 1726882571.76514: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 15621 1726882571.76517: _low_level_execute_command(): starting 15621 1726882571.76527: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882571.6364439-15765-200820459952773/AnsiballZ_setup.py && sleep 0' 15621 1726882571.76959: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882571.77007: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882571.77011: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882571.77104: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 15621 1726882571.79967: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 15621 1726882571.80091: stdout chunk (state=3): >>>import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 15621 1726882571.80173: stdout chunk (state=3): >>>import '_io' # <<< 15621 1726882571.80193: stdout chunk (state=3): >>>import 'marshal' # <<< 15621 1726882571.80257: stdout chunk (state=3): >>>import 'posix' # <<< 15621 1726882571.80304: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 15621 1726882571.80318: stdout chunk (state=3): >>># installing zipimport hook <<< 15621 1726882571.80357: stdout chunk (state=3): >>>import 'time' # <<< 15621 1726882571.80381: stdout chunk (state=3): >>>import 'zipimport' # <<< 15621 1726882571.80389: stdout chunk (state=3): >>># installed zipimport hook <<< 15621 1726882571.80494: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 15621 1726882571.80498: stdout chunk (state=3): >>># code object 
from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 15621 1726882571.80709: stdout chunk (state=3): >>>import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d3fc530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d3cbb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 15621 1726882571.80959: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d3feab0> <<< 15621 1726882571.80962: stdout chunk (state=3): >>>import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # <<< 15621 1726882571.81031: stdout chunk (state=3): >>>import '_collections_abc' # <<< 15621 1726882571.81091: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 15621 1726882571.81151: stdout chunk (state=3): >>>import 'os' # import '_sitebuiltins' # <<< 15621 1726882571.81185: stdout chunk (state=3): >>> <<< 15621 1726882571.81228: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages'<<< 15621 1726882571.81240: stdout chunk (state=3): >>> <<< 15621 1726882571.81273: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py<<< 15621 1726882571.81307: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 15621 1726882571.81428: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d1d11c0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 15621 1726882571.81452: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 15621 1726882571.81478: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d1d2000> <<< 15621 1726882571.81521: stdout chunk (state=3): >>>import 'site' # <<< 15621 1726882571.81543: stdout chunk (state=3): >>> <<< 15621 1726882571.81574: stdout chunk (state=3): >>>Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux<<< 15621 1726882571.81586: stdout chunk (state=3): >>> Type "help", "copyright", "credits" or "license" for more information.<<< 15621 1726882571.81726: stdout chunk (state=3): >>> <<< 15621 1726882571.82086: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 15621 1726882571.82105: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 15621 1726882571.82130: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 15621 1726882571.82344: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 15621 1726882571.82537: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d20fe90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d20ff50> <<< 15621 1726882571.82643: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d247830> <<< 15621 1726882571.82748: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d247ec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d227b00> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d2251f0> <<< 15621 1726882571.82782: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d20d040> <<< 15621 1726882571.82808: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<< 15621 1726882571.82826: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 15621 1726882571.82863: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 15621 1726882571.82914: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d26b7d0> <<< 15621 1726882571.82941: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d26a3f0> <<< 15621 
1726882571.82969: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d2262a0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d268bf0> <<< 15621 1726882571.83021: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 15621 1726882571.83047: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d298830> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d20c2f0> <<< 15621 1726882571.83067: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 15621 1726882571.83236: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681d298ce0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d298b90> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681d298f80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d20ae40> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py <<< 15621 1726882571.83357: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d299670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d299340> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d29a570> <<< 15621 1726882571.83386: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # <<< 15621 1726882571.83389: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 15621 1726882571.83460: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 15621 1726882571.83465: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from 
'/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d2b4770> <<< 15621 1726882571.83513: stdout chunk (state=3): >>>import 'errno' # <<< 15621 1726882571.83543: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681d2b5eb0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 15621 1726882571.83575: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 15621 1726882571.83586: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d2b6d50> <<< 15621 1726882571.83697: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 15621 1726882571.83701: stdout chunk (state=3): >>>import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681d2b73b0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d2b62a0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 15621 1726882571.83760: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681d2b7e00> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d2b7530> <<< 15621 1726882571.83788: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d29a5a0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 15621 1726882571.83908: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 15621 1726882571.83912: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681cffbcb0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 15621 1726882571.83937: stdout chunk (state=3): >>># extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681d024830> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d024590> <<< 15621 1726882571.83965: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681d024860> <<< 15621 1726882571.84013: stdout chunk (state=3): >>># extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681d024a40> <<< 15621 1726882571.84037: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cff9e50> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 15621 1726882571.84132: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 15621 1726882571.84209: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 15621 1726882571.84234: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d0260f0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d024d70> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d29ac90> <<< 15621 1726882571.84249: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 15621 1726882571.84297: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 15621 1726882571.84323: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 15621 1726882571.84362: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 15621 1726882571.84443: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d04e480> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 15621 1726882571.84475: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 15621 1726882571.84537: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 15621 1726882571.84556: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d06a630> <<< 15621 1726882571.84567: 
stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 15621 1726882571.84596: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 15621 1726882571.84654: stdout chunk (state=3): >>>import 'ntpath' # <<< 15621 1726882571.84743: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d09f410> <<< 15621 1726882571.84769: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 15621 1726882571.84793: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 15621 1726882571.84894: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d0c5bb0> <<< 15621 1726882571.85068: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d09f530> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d06b2c0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cea8590> <<< 15621 1726882571.85072: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d069670> <<< 15621 1726882571.85084: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d027020> <<< 15621 1726882571.85237: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 15621 1726882571.85262: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f681cea8860> <<< 15621 1726882571.85638: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_agakc8v1/ansible_setup_payload.zip' # zipimport: zlib available <<< 15621 1726882571.85754: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882571.85795: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 15621 1726882571.85853: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 15621 1726882571.85959: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 15621 1726882571.86003: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' <<< 15621 1726882571.86024: stdout chunk (state=3): >>>import 
'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cf16390> import '_typing' # <<< 15621 1726882571.86307: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681ceed280> <<< 15621 1726882571.86318: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681ceec3e0> # zipimport: zlib available <<< 15621 1726882571.86347: stdout chunk (state=3): >>>import 'ansible' # <<< 15621 1726882571.86373: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15621 1726882571.86392: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils' # <<< 15621 1726882571.86426: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882571.88734: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882571.90239: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cf14260> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 15621 1726882571.90264: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 15621 1726882571.90294: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 15621 1726882571.90316: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681cf41e20> <<< 15621 1726882571.90358: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cf41bb0> <<< 15621 1726882571.90402: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cf414c0> <<< 15621 1726882571.90430: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 15621 1726882571.90444: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 15621 1726882571.90476: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cf41910> <<< 15621 1726882571.90508: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cf17020> import 'atexit' # <<< 15621 1726882571.90532: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader 
object at 0x7f681cf42b70> <<< 15621 1726882571.90583: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681cf42d50> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 15621 1726882571.90644: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 15621 1726882571.90674: stdout chunk (state=3): >>>import '_locale' # <<< 15621 1726882571.90734: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cf43290> import 'pwd' # <<< 15621 1726882571.90783: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 15621 1726882571.90831: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cdacfb0> <<< 15621 1726882571.90869: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681cdaebd0> <<< 15621 1726882571.90890: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 15621 1726882571.90909: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 15621 1726882571.91005: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cdaf590> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py<<< 15621 1726882571.91014: stdout chunk (state=3): >>> <<< 15621 1726882571.91053: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 15621 1726882571.91084: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cdb0770> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 15621 1726882571.91131: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 15621 1726882571.91228: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 15621 1726882571.91260: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cdb3230> <<< 15621 1726882571.91355: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681d026f30> import 
'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cdb14f0><<< 15621 1726882571.91359: stdout chunk (state=3): >>> <<< 15621 1726882571.91379: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py<<< 15621 1726882571.91435: stdout chunk (state=3): >>> <<< 15621 1726882571.91447: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc'<<< 15621 1726882571.91534: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 15621 1726882571.91586: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py<<< 15621 1726882571.91589: stdout chunk (state=3): >>> <<< 15621 1726882571.91608: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 15621 1726882571.91642: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cdb71d0> import '_tokenize' # <<< 15621 1726882571.91758: stdout chunk (state=3): >>> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cdb5ca0> <<< 15621 1726882571.91780: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cdb5a00> <<< 15621 1726882571.91811: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 15621 1726882571.91857: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 15621 1726882571.91999: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cdb5f70><<< 15621 1726882571.92002: stdout chunk (state=3): >>> <<< 15621 1726882571.92059: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cdb1a00><<< 15621 1726882571.92063: stdout chunk (state=3): >>> <<< 15621 1726882571.92091: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so'<<< 15621 1726882571.92151: stdout chunk (state=3): >>> import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681cdfb230> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py<<< 15621 1726882571.92169: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc'<<< 15621 1726882571.92189: stdout chunk (state=3): >>> import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cdfb440><<< 15621 1726882571.92213: stdout chunk (state=3): >>> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 15621 1726882571.92252: stdout chunk (state=3): >>># code 
object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc'<<< 15621 1726882571.92303: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py<<< 15621 1726882571.92306: stdout chunk (state=3): >>> <<< 15621 1726882571.92353: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so'<<< 15621 1726882571.92378: stdout chunk (state=3): >>> <<< 15621 1726882571.92401: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681cdfcef0><<< 15621 1726882571.92414: stdout chunk (state=3): >>> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cdfcce0><<< 15621 1726882571.92442: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py<<< 15621 1726882571.92450: stdout chunk (state=3): >>> <<< 15621 1726882571.92637: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc'<<< 15621 1726882571.92711: stdout chunk (state=3): >>> # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 15621 1726882571.92740: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 15621 1726882571.92759: stdout chunk (state=3): >>>import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681cdff380> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cdfd580><<< 15621 1726882571.92799: stdout chunk (state=3): >>> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py<<< 15621 1726882571.92814: stdout chunk (state=3): >>> <<< 15621 1726882571.92881: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc'<<< 15621 1726882571.92919: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py<<< 15621 1726882571.92924: stdout chunk (state=3): >>> <<< 15621 1726882571.92960: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 15621 1726882571.93225: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681ce0ab40> <<< 15621 1726882571.93276: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cdff4d0><<< 15621 1726882571.93286: stdout chunk (state=3): >>> <<< 15621 1726882571.93402: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 15621 1726882571.93431: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 15621 1726882571.93442: stdout chunk (state=3): >>>import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f681ce0b7d0> <<< 15621 1726882571.93486: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so'<<< 15621 1726882571.93513: stdout chunk (state=3): >>> # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 15621 1726882571.93537: stdout chunk (state=3): >>>import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681ce0ba10> <<< 15621 1726882571.93606: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so'<<< 15621 1726882571.93619: stdout chunk (state=3): >>> # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 15621 1726882571.93655: stdout chunk (state=3): >>>import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681ce0bdd0> <<< 15621 1726882571.93679: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cdfb620><<< 15621 1726882571.93695: stdout chunk (state=3): >>> <<< 15621 1726882571.93711: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py<<< 15621 1726882571.93737: stdout chunk (state=3): >>> <<< 15621 1726882571.93767: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py<<< 15621 1726882571.93790: stdout chunk (state=3): >>> <<< 15621 1726882571.93812: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc'<<< 15621 1726882571.93868: stdout chunk (state=3): >>> # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 15621 1726882571.93906: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so'<<< 15621 1726882571.94025: stdout chunk (state=3): >>> import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681ce0f350> <<< 15621 1726882571.94215: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 15621 1726882571.94251: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681ce10800> <<< 15621 1726882571.94284: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681ce0db20> <<< 15621 1726882571.94334: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 15621 1726882571.94366: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681ce0eea0><<< 15621 
1726882571.94384: stdout chunk (state=3): >>> <<< 15621 1726882571.94412: stdout chunk (state=3): >>>import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681ce0d7f0> # zipimport: zlib available <<< 15621 1726882571.94448: stdout chunk (state=3): >>># zipimport: zlib available<<< 15621 1726882571.94461: stdout chunk (state=3): >>> import 'ansible.module_utils.compat' # <<< 15621 1726882571.94532: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882571.94655: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882571.94806: stdout chunk (state=3): >>># zipimport: zlib available<<< 15621 1726882571.94838: stdout chunk (state=3): >>> # zipimport: zlib available<<< 15621 1726882571.94849: stdout chunk (state=3): >>> import 'ansible.module_utils.common' # <<< 15621 1726882571.94887: stdout chunk (state=3): >>> # zipimport: zlib available <<< 15621 1726882571.94903: stdout chunk (state=3): >>># zipimport: zlib available<<< 15621 1726882571.94927: stdout chunk (state=3): >>> import 'ansible.module_utils.common.text' # <<< 15621 1726882571.95169: stdout chunk (state=3): >>> # zipimport: zlib available # zipimport: zlib available<<< 15621 1726882571.95183: stdout chunk (state=3): >>> <<< 15621 1726882571.95409: stdout chunk (state=3): >>># zipimport: zlib available<<< 15621 1726882571.95412: stdout chunk (state=3): >>> <<< 15621 1726882571.96480: stdout chunk (state=3): >>># zipimport: zlib available<<< 15621 1726882571.96483: stdout chunk (state=3): >>> <<< 15621 1726882571.97528: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 15621 1726882571.97554: stdout chunk (state=3): >>> <<< 15621 1726882571.97580: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # <<< 15621 1726882571.97593: stdout chunk (state=3): >>> import 'ansible.module_utils.common.text.converters' # <<< 15621 1726882571.97626: stdout chunk (state=3): >>> <<< 15621 1726882571.97649: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 15621 1726882571.97683: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc'<<< 15621 1726882571.97781: stdout chunk (state=3): >>> # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' <<< 15621 1726882571.97788: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' <<< 15621 1726882571.97810: stdout chunk (state=3): >>>import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681cc988c0> <<< 15621 1726882571.97955: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py<<< 15621 1726882571.97972: stdout chunk (state=3): >>> <<< 15621 1726882571.97994: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cc996a0> <<< 15621 1726882571.98017: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681ce0c0e0><<< 15621 1726882571.98090: stdout chunk (state=3): >>> import 'ansible.module_utils.compat.selinux' # <<< 15621 1726882571.98119: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882571.98174: stdout chunk (state=3): >>># zipimport: zlib available<<< 15621 1726882571.98204: stdout chunk (state=3): >>> import 'ansible.module_utils._text' # <<< 15621 1726882571.98434: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882571.98529: stdout chunk (state=3): >>># zipimport: zlib available<<< 15621 1726882571.98540: stdout chunk (state=3): >>> <<< 15621 1726882571.98830: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py<<< 15621 1726882571.98833: stdout chunk (state=3): >>> <<< 15621 1726882571.98871: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cc99640> <<< 15621 1726882571.98901: stdout chunk (state=3): >>># zipimport: zlib available<<< 15621 1726882571.98912: stdout chunk (state=3): >>> <<< 15621 1726882571.99806: stdout chunk (state=3): >>># zipimport: zlib available<<< 15621 1726882571.99830: stdout chunk (state=3): >>> <<< 15621 1726882572.00663: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.00803: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.00951: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 15621 1726882572.00972: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.01047: stdout chunk (state=3): >>># zipimport: zlib available<<< 15621 1726882572.01066: stdout chunk (state=3): >>> <<< 15621 1726882572.01111: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 15621 1726882572.01114: stdout chunk (state=3): >>> <<< 15621 1726882572.01226: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.01265: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.01427: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 15621 1726882572.01461: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.01502: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.01520: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing' # <<< 15621 1726882572.01543: stdout chunk (state=3): >>># zipimport: zlib available<<< 15621 1726882572.01625: stdout chunk (state=3): >>> # zipimport: zlib available<<< 15621 1726882572.01647: stdout chunk (state=3): >>> <<< 15621 1726882572.01693: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 15621 1726882572.01705: stdout chunk (state=3): >>> <<< 15621 1726882572.01734: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.02170: stdout chunk (state=3): >>># zipimport: zlib available<<< 15621 1726882572.02190: stdout chunk (state=3): >>> <<< 15621 1726882572.02753: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 15621 1726882572.02757: stdout chunk (state=3): >>>import '_ast' # <<< 15621 1726882572.02893: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cc9a390> <<< 15621 1726882572.02904: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.03037: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 
1726882572.03176: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 15621 1726882572.03205: stdout chunk (state=3): >>> import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 15621 1726882572.03258: stdout chunk (state=3): >>> import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py<<< 15621 1726882572.03261: stdout chunk (state=3): >>> <<< 15621 1726882572.03282: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc'<<< 15621 1726882572.03401: stdout chunk (state=3): >>> # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 15621 1726882572.03668: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681cca2180> <<< 15621 1726882572.03719: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 15621 1726882572.03756: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681cca2b10> <<< 15621 1726882572.03791: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d298bf0><<< 15621 1726882572.03802: stdout chunk (state=3): >>> <<< 15621 1726882572.04063: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15621 1726882572.04081: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available <<< 15621 1726882572.04136: stdout chunk (state=3): >>># zipimport: zlib available<<< 15621 1726882572.04239: stdout chunk (state=3): >>> # zipimport: zlib available <<< 15621 1726882572.04362: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 15621 1726882572.04443: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc'<<< 15621 1726882572.04462: stdout chunk (state=3): >>> <<< 15621 1726882572.04615: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 15621 1726882572.04640: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 15621 1726882572.04702: stdout chunk (state=3): >>>import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681cca1790> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cca2c90> <<< 15621 1726882572.04842: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 15621 1726882572.04860: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.04931: stdout chunk (state=3): >>># zipimport: zlib available<<< 15621 1726882572.05093: 
stdout chunk (state=3): >>> # zipimport: zlib available # zipimport: zlib available<<< 15621 1726882572.05106: stdout chunk (state=3): >>> <<< 15621 1726882572.05196: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 15621 1726882572.05233: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py<<< 15621 1726882572.05266: stdout chunk (state=3): >>> <<< 15621 1726882572.05351: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 15621 1726882572.05450: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 15621 1726882572.05481: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc'<<< 15621 1726882572.05526: stdout chunk (state=3): >>> <<< 15621 1726882572.05626: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cd32de0><<< 15621 1726882572.05728: stdout chunk (state=3): >>> <<< 15621 1726882572.05756: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681ccaca70> <<< 15621 1726882572.05798: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681ccaac00><<< 15621 1726882572.05820: stdout chunk (state=3): >>> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681ccaaa80> # destroy ansible.module_utils.distro<<< 15621 1726882572.05908: stdout chunk (state=3): >>> import 'ansible.module_utils.distro' # # zipimport: zlib available<<< 15621 1726882572.05911: stdout chunk (state=3): >>> # zipimport: zlib available <<< 15621 1726882572.06174: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 15621 1726882572.06177: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 15621 1726882572.06361: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available<<< 15621 1726882572.06408: stdout chunk (state=3): >>> # zipimport: zlib available <<< 15621 1726882572.06447: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.06532: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.06619: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.06701: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.06757: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 15621 1726882572.06777: stdout chunk (state=3): >>> # zipimport: zlib available <<< 15621 1726882572.06918: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.07087: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15621 1726882572.07177: stdout chunk 
(state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 15621 1726882572.07198: stdout chunk (state=3): >>> # zipimport: zlib available <<< 15621 1726882572.07511: stdout chunk (state=3): >>># zipimport: zlib available<<< 15621 1726882572.07737: stdout chunk (state=3): >>> <<< 15621 1726882572.07852: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.07919: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.08011: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py<<< 15621 1726882572.08030: stdout chunk (state=3): >>> <<< 15621 1726882572.08134: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 15621 1726882572.08170: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cd39b80> <<< 15621 1726882572.08211: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 15621 1726882572.08287: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 15621 1726882572.08295: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py<<< 15621 1726882572.08298: stdout chunk (state=3): >>> <<< 15621 1726882572.08453: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 15621 1726882572.08656: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c2bc350> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so'<<< 15621 1726882572.08659: stdout chunk (state=3): >>> <<< 15621 1726882572.08662: stdout chunk (state=3): >>>import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681c2bc680> <<< 15621 1726882572.08754: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cd193a0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cd18320> <<< 15621 1726882572.08758: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cd38260><<< 15621 1726882572.08816: stdout chunk (state=3): >>> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cd38d10><<< 15621 1726882572.08825: stdout chunk 
(state=3): >>> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py<<< 15621 1726882572.08835: stdout chunk (state=3): >>> <<< 15621 1726882572.08984: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc'<<< 15621 1726882572.09031: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py<<< 15621 1726882572.09034: stdout chunk (state=3): >>> <<< 15621 1726882572.09105: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' <<< 15621 1726882572.09109: stdout chunk (state=3): >>># extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' <<< 15621 1726882572.09146: stdout chunk (state=3): >>>import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681c2bf6e0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c2bef90> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' <<< 15621 1726882572.09427: stdout chunk (state=3): >>># extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681c2bf170> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c2be3c0> <<< 15621 1726882572.09431: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 15621 1726882572.09433: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c2bf890><<< 15621 1726882572.09436: stdout chunk (state=3): >>> <<< 15621 1726882572.09462: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 15621 1726882572.09516: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc'<<< 15621 1726882572.09532: stdout chunk (state=3): >>> <<< 15621 1726882572.09565: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so'<<< 15621 1726882572.09588: stdout chunk (state=3): >>> # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so'<<< 15621 1726882572.09608: stdout chunk (state=3): >>> import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681c326360><<< 15621 1726882572.09618: stdout chunk (state=3): >>> <<< 15621 1726882572.09678: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c2bffe0><<< 15621 1726882572.09682: stdout chunk (state=3): >>> <<< 
15621 1726882572.09777: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cd392e0> <<< 15621 1726882572.09907: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available <<< 15621 1726882572.09911: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other' # # zipimport: zlib available <<< 15621 1726882572.10059: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # <<< 15621 1726882572.10093: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.10187: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.10277: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 15621 1726882572.10285: stdout chunk (state=3): >>> <<< 15621 1726882572.10309: stdout chunk (state=3): >>># zipimport: zlib available<<< 15621 1726882572.10313: stdout chunk (state=3): >>> <<< 15621 1726882572.10336: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 15621 1726882572.10374: stdout chunk (state=3): >>> # zipimport: zlib available<<< 15621 1726882572.10377: stdout chunk (state=3): >>> <<< 15621 1726882572.10473: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # <<< 15621 1726882572.10495: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.10587: stdout chunk (state=3): >>># zipimport: zlib available<<< 15621 1726882572.10674: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.system.caps' # <<< 15621 1726882572.10690: stdout chunk (state=3): >>> <<< 15621 1726882572.10701: stdout chunk (state=3): >>># zipimport: zlib available<<< 15621 1726882572.10707: stdout chunk (state=3): >>> <<< 15621 1726882572.10784: stdout chunk (state=3): >>># zipimport: zlib available<<< 15621 1726882572.10791: stdout chunk (state=3): >>> <<< 15621 1726882572.10864: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 15621 1726882572.10867: stdout chunk (state=3): >>> <<< 15621 1726882572.10882: stdout chunk (state=3): >>># zipimport: zlib available<<< 15621 1726882572.10988: stdout chunk (state=3): >>> # zipimport: zlib available<<< 15621 1726882572.10992: stdout chunk (state=3): >>> <<< 15621 1726882572.11098: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.11210: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.11314: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # <<< 15621 1726882572.11341: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.cmdline' # <<< 15621 1726882572.11368: stdout chunk (state=3): >>># zipimport: zlib available<<< 15621 1726882572.11375: stdout chunk (state=3): >>> <<< 15621 1726882572.12650: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.13136: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available<<< 15621 1726882572.13239: stdout chunk (state=3): >>> # zipimport: zlib available<<< 15621 1726882572.13254: stdout chunk (state=3): >>> <<< 15621 1726882572.13341: stdout chunk (state=3): >>># zipimport: zlib available<<< 15621 1726882572.13355: stdout chunk (state=3): >>> <<< 15621 1726882572.13411: stdout chunk (state=3): >>># zipimport: 
zlib available<<< 15621 1726882572.13432: stdout chunk (state=3): >>> <<< 15621 1726882572.13519: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # <<< 15621 1726882572.13539: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available<<< 15621 1726882572.13569: stdout chunk (state=3): >>> # zipimport: zlib available<<< 15621 1726882572.13586: stdout chunk (state=3): >>> <<< 15621 1726882572.13718: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available<<< 15621 1726882572.13741: stdout chunk (state=3): >>> <<< 15621 1726882572.13773: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.13882: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 15621 1726882572.13905: stdout chunk (state=3): >>># zipimport: zlib available<<< 15621 1726882572.13919: stdout chunk (state=3): >>> <<< 15621 1726882572.14058: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # <<< 15621 1726882572.14064: stdout chunk (state=3): >>> # zipimport: zlib available <<< 15621 1726882572.14090: stdout chunk (state=3): >>># zipimport: zlib available<<< 15621 1726882572.14143: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.system.loadavg' # <<< 15621 1726882572.14197: stdout chunk (state=3): >>> # zipimport: zlib available <<< 15621 1726882572.14313: stdout chunk (state=3): >>># zipimport: zlib available<<< 15621 1726882572.14465: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 15621 1726882572.14525: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 15621 1726882572.14618: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c326600> <<< 15621 1726882572.14626: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 15621 1726882572.14646: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 15621 1726882572.14870: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c327260> <<< 15621 1726882572.14901: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.local' # <<< 15621 1726882572.14908: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.15036: stdout chunk (state=3): >>># zipimport: zlib available<<< 15621 1726882572.15148: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.system.lsb' # <<< 15621 1726882572.15177: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.15344: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.15512: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 15621 1726882572.15540: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.15666: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.15796: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 15621 1726882572.15837: stdout chunk (state=3): >>> # zipimport: zlib available<<< 15621 1726882572.15840: stdout chunk (state=3): >>> <<< 15621 1726882572.15920: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 15621 1726882572.15982: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 15621 1726882572.16124: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 15621 1726882572.16159: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 15621 1726882572.16293: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 15621 1726882572.16306: stdout chunk (state=3): >>>import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681c35a840> <<< 15621 1726882572.16655: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c3435c0> <<< 15621 1726882572.16727: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 15621 1726882572.16953: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available<<< 15621 1726882572.17025: stdout chunk (state=3): >>> <<< 15621 1726882572.17078: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.17229: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.17434: stdout chunk (state=3): >>># zipimport: zlib available<<< 15621 1726882572.17497: stdout chunk (state=3): >>> <<< 15621 1726882572.17695: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # <<< 15621 1726882572.17819: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available <<< 15621 1726882572.18051: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available <<< 15621 1726882572.18064: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 15621 1726882572.18099: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 15621 1726882572.18309: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681c0e6000> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c3589e0> <<< 15621 1726882572.18312: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.user' # <<< 15621 1726882572.18315: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available <<< 15621 1726882572.18399: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.18489: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 15621 1726882572.18507: stdout chunk (state=3): >>> # zipimport: zlib available<<< 15621 1726882572.18802: stdout chunk (state=3): >>> # zipimport: zlib available <<< 15621 1726882572.19171: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 15621 1726882572.19242: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.19439: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.19521: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.19556: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.19662: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # <<< 15621 1726882572.19761: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.19838: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.19852: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.20040: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.20187: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 15621 1726882572.20239: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.20467: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # <<< 15621 1726882572.20470: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.20498: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.20537: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.21134: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.21697: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 15621 1726882572.21825: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.21828: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.21938: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 15621 1726882572.21952: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.22041: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.22161: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 15621 1726882572.22164: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.22327: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.22638: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available <<< 15621 1726882572.22642: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 15621 1726882572.22661: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available <<< 15621 1726882572.22737: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.22841: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.23060: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.23287: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 15621 1726882572.23324: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.23446: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.23457: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # 
zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 15621 1726882572.23526: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.23594: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available <<< 15621 1726882572.23631: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.23665: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 15621 1726882572.23793: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available <<< 15621 1726882572.23848: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.23913: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available <<< 15621 1726882572.24209: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.24553: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 15621 1726882572.24582: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15621 1726882572.24624: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 15621 1726882572.24674: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.24711: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.24714: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 15621 1726882572.25083: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # <<< 15621 1726882572.25087: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available <<< 15621 1726882572.25089: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # <<< 15621 1726882572.25130: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.25134: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.25194: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available <<< 15621 1726882572.25235: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.25292: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15621 1726882572.25427: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15621 1726882572.25487: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # <<< 15621 1726882572.25500: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 15621 1726882572.25550: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.25664: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 15621 1726882572.25828: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.26044: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available <<< 15621 
1726882572.26098: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.26214: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available <<< 15621 1726882572.26246: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 15621 1726882572.26264: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.26361: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.26506: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available <<< 15621 1726882572.26775: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 15621 1726882572.26928: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882572.27104: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 15621 1726882572.27133: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 15621 1726882572.27195: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681c10ea80> <<< 15621 1726882572.27228: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c10c3e0> <<< 15621 1726882572.27274: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c10cc80> <<< 15621 1726882573.66455: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCqxUjCEqNblrTyW6Uf6mIYxca8N+p8oJNuOoXU65bGRNg3CMa5WjaOdqcLJoa5cqHU94Eb2GKTTyez0hcVUk7tsi3NxQudVrBDJQwbGPLKwHTfAOeffQrSKU6cQIc1wl+jLeNyQet7t+mRPHDLLjdsLuWud7KDSFY7tB05hqCIT7br7Ql/dFhcnCdWQFQMOHFOz3ScJe9gey/LD3ji7GRONjSr/t5cpKmB6mxzEmsb1n6YZdbP8HCphGcvKR4W+uaX3gVfQE0qvrqlobTyex8yIrkML2bRGO0cQ0YQWRYUwl+2NZufO8pixR1WlzvjooEQLCa77cJ6SZ8LyFkDOI+mMyuj8kcM9hS4AD91rPxl8C0d6Jg8RKqnImxC3X/NNaRYHqlewUGo6VKkcO4+lxgJGqYFcmkGEHzq4fuf6gtrr3rJkcIFcrluI0mSyZ2wXzI9K1OLHK0fnDvDUdV21RdTxfpz2ZFqykIWxdtugE4qaNMgbtV0VnufdkfZoCt9ayU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCVkskQ7Wf194qJaR5aLJzIbxDeKLsVL0wQFKV8r0F7GGZAGvI7/LHajoQ1NvRR35h4P+UpQQWPriVBtLfXYfXQ=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHICQbhAvKstSrwCX3R+nlPjOjLF0EHt/gL32n1ZS9Xl", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_lsb": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_python": 
{"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "ip-10-31-45-226.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-226", "ansible_nodename": "ip-10-31-45-226.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ea14692d25e88f0b7167787b368d", "ansible_fips": false, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.180 36814 10.31.45.226 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 36814 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", "second": "13", "epoch": "1726882573", "epoch_int": "1726882573", "date": "2024-09-20", "time": "21:36:13", "iso8601_micro": "2024-09-21T01:36:13.654312Z", "iso8601": "2024-09-21T01:36:13Z", "iso8601_basic": "20240920T213613654312", "iso8601_basic_short": "20240920T213613", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_local": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto<<< 15621 1726882573.66461: stdout chunk (state=3): >>>", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", 
"ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15621 1726882573.67073: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 15621 1726882573.67129: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections <<< 15621 1726882573.67168: stdout chunk (state=3): >>># cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing 
tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json <<< 15621 1726882573.67237: stdout chunk (state=3): >>># cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string <<< 15621 1726882573.67241: stdout chunk (state=3): >>># destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings <<< 15621 1726882573.67299: stdout chunk (state=3): >>># destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy 
ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux <<< 15621 1726882573.67315: stdout chunk (state=3): >>># cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro <<< 15621 1726882573.67363: stdout chunk (state=3): >>># cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool <<< 15621 1726882573.67394: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing 
ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi <<< 15621 1726882573.67475: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # 
cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix <<< 15621 1726882573.67480: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd <<< 15621 1726882573.67494: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy 
ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 15621 1726882573.67809: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 15621 1726882573.67824: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 15621 1726882573.67869: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma<<< 15621 1726882573.67895: stdout chunk (state=3): >>> # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 15621 1726882573.67932: stdout chunk (state=3): >>># destroy ntpath <<< 15621 1726882573.67958: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib <<< 15621 1726882573.68020: stdout chunk (state=3): >>># destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal <<< 15621 1726882573.68046: stdout chunk (state=3): >>># destroy _posixsubprocess # destroy syslog # destroy uuid <<< 15621 1726882573.68095: stdout chunk (state=3): >>># destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil <<< 15621 1726882573.68142: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 15621 1726882573.68197: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue <<< 15621 1726882573.68232: stdout chunk (state=3): >>># destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing <<< 15621 1726882573.68268: stdout chunk (state=3): >>># destroy shlex # destroy fcntl # destroy datetime <<< 15621 1726882573.68284: stdout chunk (state=3): >>># destroy subprocess # destroy base64 # destroy _ssl <<< 15621 1726882573.68332: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios <<< 15621 1726882573.68364: stdout chunk (state=3): >>># destroy errno # destroy json <<< 15621 1726882573.68376: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 15621 1726882573.68408: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes <<< 15621 1726882573.68469: stdout chunk (state=3): >>># cleanup[3] wiping 
ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string <<< 15621 1726882573.68494: stdout chunk (state=3): >>># cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading <<< 15621 1726882573.68505: stdout chunk (state=3): >>># cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external <<< 15621 1726882573.68549: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre <<< 15621 1726882573.68561: stdout chunk (state=3): >>># cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc<<< 15621 1726882573.68589: stdout chunk (state=3): >>> # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator <<< 15621 1726882573.68593: stdout chunk (state=3): >>># cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat <<< 15621 1726882573.68632: stdout chunk (state=3): >>># cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io <<< 15621 1726882573.68637: stdout chunk (state=3): >>># destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix <<< 15621 1726882573.68640: stdout chunk (state=3): >>># cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys <<< 15621 1726882573.68660: stdout chunk (state=3): >>># cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 15621 1726882573.68807: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 15621 1726882573.68836: stdout chunk (state=3): >>># destroy _collections <<< 15621 1726882573.68865: stdout chunk (state=3): >>># destroy platform # destroy _uuid <<< 15621 1726882573.68868: stdout chunk (state=3): >>># destroy stat # destroy genericpath # destroy re._parser <<< 15621 1726882573.68888: stdout chunk (state=3): >>># destroy tokenize <<< 15621 1726882573.68901: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 15621 1726882573.68946: stdout chunk 
(state=3): >>># destroy _typing <<< 15621 1726882573.68961: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 15621 1726882573.68968: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 15621 1726882573.68996: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 15621 1726882573.69100: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs <<< 15621 1726882573.69113: stdout chunk (state=3): >>># destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect <<< 15621 1726882573.69123: stdout chunk (state=3): >>># destroy time <<< 15621 1726882573.69143: stdout chunk (state=3): >>># destroy _random # destroy _weakref <<< 15621 1726882573.69180: stdout chunk (state=3): >>># destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re <<< 15621 1726882573.69200: stdout chunk (state=3): >>># destroy itertools <<< 15621 1726882573.69215: stdout chunk (state=3): >>># destroy _abc # destroy posix # destroy _functools # destroy builtins <<< 15621 1726882573.69218: stdout chunk (state=3): >>># destroy _thread # clear sys.audit hooks <<< 15621 1726882573.69615: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 
<<< 15621 1726882573.69687: stderr chunk (state=3): >>><<< 15621 1726882573.69690: stdout chunk (state=3): >>><<< 15621 1726882573.69792: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d3fc530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d3cbb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d3feab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d1d11c0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d1d2000> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d20fe90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d20ff50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d247830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d247ec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d227b00> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d2251f0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d20d040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d26b7d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d26a3f0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d2262a0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d268bf0> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d298830> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d20c2f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681d298ce0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d298b90> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681d298f80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d20ae40> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d299670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d299340> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d29a570> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d2b4770> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681d2b5eb0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f681d2b6d50> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681d2b73b0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d2b62a0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681d2b7e00> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d2b7530> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d29a5a0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681cffbcb0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681d024830> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d024590> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681d024860> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681d024a40> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cff9e50> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d0260f0> 
import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d024d70> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d29ac90> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d04e480> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d06a630> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d09f410> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d0c5bb0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d09f530> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d06b2c0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cea8590> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d069670> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d027020> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f681cea8860> # zipimport: found 103 names in '/tmp/ansible_setup_payload_agakc8v1/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # 
/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cf16390> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681ceed280> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681ceec3e0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cf14260> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681cf41e20> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cf41bb0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cf414c0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cf41910> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cf17020> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681cf42b70> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681cf42d50> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cf43290> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from 
'/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cdacfb0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681cdaebd0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cdaf590> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cdb0770> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cdb3230> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681d026f30> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cdb14f0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cdb71d0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cdb5ca0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cdb5a00> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cdb5f70> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cdb1a00> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681cdfb230> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cdfb440> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681cdfcef0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cdfcce0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681cdff380> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cdfd580> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681ce0ab40> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cdff4d0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681ce0b7d0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681ce0ba10> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 
'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681ce0bdd0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cdfb620> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681ce0f350> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681ce10800> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681ce0db20> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681ce0eea0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681ce0d7f0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681cc988c0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cc996a0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681ce0c0e0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available 
# zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cc99640> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cc9a390> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681cca2180> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681cca2b10> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681d298bf0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681cca1790> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cca2c90> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cd32de0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681ccaca70> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681ccaac00> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681ccaaa80> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cd39b80> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f681c2bc350> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681c2bc680> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cd193a0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cd18320> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cd38260> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cd38d10> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681c2bf6e0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c2bef90> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681c2bf170> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c2be3c0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c2bf890> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681c326360> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c2bffe0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cd392e0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib 
available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c326600> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c327260> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681c35a840> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c3435c0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 
'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681c0e6000> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c3589e0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681c10ea80> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c10c3e0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c10cc80> {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCqxUjCEqNblrTyW6Uf6mIYxca8N+p8oJNuOoXU65bGRNg3CMa5WjaOdqcLJoa5cqHU94Eb2GKTTyez0hcVUk7tsi3NxQudVrBDJQwbGPLKwHTfAOeffQrSKU6cQIc1wl+jLeNyQet7t+mRPHDLLjdsLuWud7KDSFY7tB05hqCIT7br7Ql/dFhcnCdWQFQMOHFOz3ScJe9gey/LD3ji7GRONjSr/t5cpKmB6mxzEmsb1n6YZdbP8HCphGcvKR4W+uaX3gVfQE0qvrqlobTyex8yIrkML2bRGO0cQ0YQWRYUwl+2NZufO8pixR1WlzvjooEQLCa77cJ6SZ8LyFkDOI+mMyuj8kcM9hS4AD91rPxl8C0d6Jg8RKqnImxC3X/NNaRYHqlewUGo6VKkcO4+lxgJGqYFcmkGEHzq4fuf6gtrr3rJkcIFcrluI0mSyZ2wXzI9K1OLHK0fnDvDUdV21RdTxfpz2ZFqykIWxdtugE4qaNMgbtV0VnufdkfZoCt9ayU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCVkskQ7Wf194qJaR5aLJzIbxDeKLsVL0wQFKV8r0F7GGZAGvI7/LHajoQ1NvRR35h4P+UpQQWPriVBtLfXYfXQ=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHICQbhAvKstSrwCX3R+nlPjOjLF0EHt/gL32n1ZS9Xl", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_lsb": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "ip-10-31-45-226.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-226", "ansible_nodename": "ip-10-31-45-226.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ea14692d25e88f0b7167787b368d", "ansible_fips": false, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.180 36814 10.31.45.226 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 36814 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", "second": "13", "epoch": "1726882573", "epoch_int": "1726882573", "date": "2024-09-20", "time": "21:36:13", "iso8601_micro": "2024-09-21T01:36:13.654312Z", "iso8601": "2024-09-21T01:36:13Z", "iso8601_basic": "20240920T213613654312", "iso8601_basic_short": "20240920T213613", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_local": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", 
"root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # 
cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy 
ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing 
ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing 
ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy 
ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools 
# cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. [WARNING]: Module invocation had junk after the JSON data:
15621 1726882573.71053: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882571.6364439-15765-200820459952773/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15621 1726882573.71073: _low_level_execute_command(): starting 15621 1726882573.71078: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882571.6364439-15765-200820459952773/ > /dev/null 2>&1 && sleep 0' 15621 1726882573.71081: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match
pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found <<< 15621 1726882573.71084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882573.71086: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882573.71088: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882573.71090: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882573.71176: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882573.73088: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882573.73141: stderr chunk (state=3): >>><<< 15621 1726882573.73145: stdout chunk (state=3): >>><<< 15621 1726882573.73159: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882573.73166: handler run complete 15621 1726882573.73208: variable 'ansible_facts' from source: unknown 15621 1726882573.73266: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882573.73351: variable 'ansible_facts' from source: unknown 15621 1726882573.73389: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882573.73431: attempt loop complete, returning result 15621 1726882573.73434: _execute() done 15621 1726882573.73437: dumping result to json 15621 1726882573.73447: done dumping result, returning 15621 1726882573.73455: done running TaskExecutor() for managed_node3/TASK: Gather the minimum subset of ansible_facts required by the network role test [0affc7ec-ae25-af1a-5b92-00000000008d] 15621 1726882573.73460: sending task result for task 0affc7ec-ae25-af1a-5b92-00000000008d 15621 1726882573.73608: done sending task result for task 0affc7ec-ae25-af1a-5b92-00000000008d 15621 1726882573.73611: WORKER PROCESS EXITING ok: [managed_node3] 15621 1726882573.73782: no more pending 
results, returning what we have 15621 1726882573.73785: results queue empty 15621 1726882573.73785: checking for any_errors_fatal 15621 1726882573.73786: done checking for any_errors_fatal 15621 1726882573.73787: checking for max_fail_percentage 15621 1726882573.73788: done checking for max_fail_percentage 15621 1726882573.73789: checking to see if all hosts have failed and the running result is not ok 15621 1726882573.73790: done checking to see if all hosts have failed 15621 1726882573.73791: getting the remaining hosts for this loop 15621 1726882573.73792: done getting the remaining hosts for this loop 15621 1726882573.73795: getting the next task for host managed_node3 15621 1726882573.73803: done getting next task for host managed_node3 15621 1726882573.73805: ^ task is: TASK: Check if system is ostree 15621 1726882573.73807: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15621 1726882573.73810: getting variables 15621 1726882573.73812: in VariableManager get_vars() 15621 1726882573.73836: Calling all_inventory to load vars for managed_node3 15621 1726882573.73838: Calling groups_inventory to load vars for managed_node3 15621 1726882573.73841: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882573.73853: Calling all_plugins_play to load vars for managed_node3 15621 1726882573.73856: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882573.73859: Calling groups_plugins_play to load vars for managed_node3 15621 1726882573.73991: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882573.74112: done with get_vars() 15621 1726882573.74126: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Friday 20 September 2024 21:36:13 -0400 (0:00:02.210) 0:00:05.821 ****** 15621 1726882573.74234: entering _queue_task() for managed_node3/stat 15621 1726882573.74561: worker is 1 (out of 1 available) 15621 1726882573.74578: exiting _queue_task() for managed_node3/stat 15621 1726882573.74594: done queuing things up, now waiting for results queue to drain 15621 1726882573.74596: waiting for pending results... 
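The task queued above is the stat-based "Check if system is ostree" check from el_repo_setup.yml:17, gated on the conditional (not __network_is_ostree is defined). As a rough illustration only, the Python sketch below approximates that kind of check on the local host; the /run/ostree-booted marker path is an assumption, since the file actually being stat'd is not visible in this portion of the log, and this is not the collection's own task code.

# Hypothetical local approximation of an "is this system ostree-booted?" check.
# The marker path is assumed; the real task registers a stat result instead.
import os

def is_ostree_booted(marker: str = "/run/ostree-booted") -> bool:
    """Return True if the assumed ostree marker file exists on this host."""
    return os.path.exists(marker)

if __name__ == "__main__":
    # Mirrors deriving a boolean fact (like __network_is_ostree) from a stat result.
    print({"__network_is_ostree": is_ostree_booted()})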
15621 1726882573.74885: running TaskExecutor() for managed_node3/TASK: Check if system is ostree 15621 1726882573.75038: in run() - task 0affc7ec-ae25-af1a-5b92-00000000008f 15621 1726882573.75042: variable 'ansible_search_path' from source: unknown 15621 1726882573.75047: variable 'ansible_search_path' from source: unknown 15621 1726882573.75055: calling self._execute() 15621 1726882573.75135: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882573.75148: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882573.75152: variable 'omit' from source: magic vars 15621 1726882573.75620: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15621 1726882573.75849: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15621 1726882573.75888: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15621 1726882573.75915: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15621 1726882573.75945: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15621 1726882573.76021: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15621 1726882573.76043: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15621 1726882573.76065: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882573.76088: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15621 1726882573.76189: Evaluated conditional (not __network_is_ostree is defined): True 15621 1726882573.76194: variable 'omit' from source: magic vars 15621 1726882573.76228: variable 'omit' from source: magic vars 15621 1726882573.76257: variable 'omit' from source: magic vars 15621 1726882573.76282: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882573.76306: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882573.76320: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882573.76336: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882573.76344: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882573.76371: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882573.76377: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882573.76380: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882573.76459: Set connection var ansible_connection to ssh 15621 1726882573.76465: Set connection var ansible_shell_executable to /bin/sh 15621 1726882573.76472: Set 
connection var ansible_timeout to 10 15621 1726882573.76477: Set connection var ansible_shell_type to sh 15621 1726882573.76483: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882573.76488: Set connection var ansible_pipelining to False 15621 1726882573.76511: variable 'ansible_shell_executable' from source: unknown 15621 1726882573.76515: variable 'ansible_connection' from source: unknown 15621 1726882573.76518: variable 'ansible_module_compression' from source: unknown 15621 1726882573.76520: variable 'ansible_shell_type' from source: unknown 15621 1726882573.76525: variable 'ansible_shell_executable' from source: unknown 15621 1726882573.76527: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882573.76529: variable 'ansible_pipelining' from source: unknown 15621 1726882573.76532: variable 'ansible_timeout' from source: unknown 15621 1726882573.76537: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882573.76652: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15621 1726882573.76660: variable 'omit' from source: magic vars 15621 1726882573.76665: starting attempt loop 15621 1726882573.76668: running the handler 15621 1726882573.76684: _low_level_execute_command(): starting 15621 1726882573.76691: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15621 1726882573.77253: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882573.77257: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882573.77262: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882573.77265: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882573.77320: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882573.77328: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882573.77330: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882573.77423: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882573.79111: stdout chunk (state=3): >>>/root <<< 15621 1726882573.79220: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882573.79282: stderr chunk (state=3): >>><<< 15621 1726882573.79287: stdout chunk (state=3): >>><<< 15621 1726882573.79313: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 
Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882573.79326: _low_level_execute_command(): starting 15621 1726882573.79334: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882573.7931018-15862-20270910547269 `" && echo ansible-tmp-1726882573.7931018-15862-20270910547269="` echo /root/.ansible/tmp/ansible-tmp-1726882573.7931018-15862-20270910547269 `" ) && sleep 0' 15621 1726882573.79829: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882573.79835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882573.79854: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration <<< 15621 1726882573.79859: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882573.79911: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882573.79919: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882573.79921: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882573.80006: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882573.81982: stdout chunk (state=3): >>>ansible-tmp-1726882573.7931018-15862-20270910547269=/root/.ansible/tmp/ansible-tmp-1726882573.7931018-15862-20270910547269 <<< 15621 1726882573.82096: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882573.82160: stderr chunk (state=3): >>><<< 15621 1726882573.82163: stdout chunk (state=3): >>><<< 15621 
1726882573.82181: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882573.7931018-15862-20270910547269=/root/.ansible/tmp/ansible-tmp-1726882573.7931018-15862-20270910547269 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882573.82238: variable 'ansible_module_compression' from source: unknown 15621 1726882573.82284: ANSIBALLZ: Using lock for stat 15621 1726882573.82288: ANSIBALLZ: Acquiring lock 15621 1726882573.82291: ANSIBALLZ: Lock acquired: 140146888267520 15621 1726882573.82293: ANSIBALLZ: Creating module 15621 1726882573.91062: ANSIBALLZ: Writing module into payload 15621 1726882573.91129: ANSIBALLZ: Writing module 15621 1726882573.91147: ANSIBALLZ: Renaming module 15621 1726882573.91156: ANSIBALLZ: Done creating module 15621 1726882573.91172: variable 'ansible_facts' from source: unknown 15621 1726882573.91220: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882573.7931018-15862-20270910547269/AnsiballZ_stat.py 15621 1726882573.91327: Sending initial data 15621 1726882573.91331: Sent initial data (152 bytes) 15621 1726882573.91812: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882573.91816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882573.91818: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882573.91821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882573.91880: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882573.91889: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 
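The two _low_level_execute_command calls just above discover the remote home directory (echo ~) and then create a per-task remote temp directory under ~/.ansible/tmp with umask 77 and a unique ansible-tmp-<epoch>-<pid>-<random> name, into which the freshly built AnsiballZ_stat.py is transferred. The sketch below re-creates that naming pattern locally for illustration; it is not Ansible's own implementation.

# Rough re-creation of the remote temp-dir pattern visible in the log:
# umask 77 plus a directory named ansible-tmp-<epoch>-<pid>-<random>.
import os
import random
import time

def make_ansible_style_tmpdir(base: str = os.path.expanduser("~/.ansible/tmp")) -> str:
    old_umask = os.umask(0o077)           # equivalent of the 'umask 77' in the shell command
    try:
        os.makedirs(base, exist_ok=True)  # mkdir -p ~/.ansible/tmp
        name = "ansible-tmp-%s-%s-%s" % (time.time(), os.getpid(), random.randint(0, 2**48))
        path = os.path.join(base, name)
        os.mkdir(path)                    # plain mkdir: fails if the name already exists
        return path
    finally:
        os.umask(old_umask)

if __name__ == "__main__":
    print(make_ansible_style_tmpdir())

The umask of 77 strips group and other permissions, so the staged module payload and its temp directory are readable only by the connecting user.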
15621 1726882573.91890: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882573.91976: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882573.93672: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15621 1726882573.93756: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15621 1726882573.93841: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmpxw__feew /root/.ansible/tmp/ansible-tmp-1726882573.7931018-15862-20270910547269/AnsiballZ_stat.py <<< 15621 1726882573.93843: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882573.7931018-15862-20270910547269/AnsiballZ_stat.py" <<< 15621 1726882573.93926: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmpxw__feew" to remote "/root/.ansible/tmp/ansible-tmp-1726882573.7931018-15862-20270910547269/AnsiballZ_stat.py" <<< 15621 1726882573.93931: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882573.7931018-15862-20270910547269/AnsiballZ_stat.py" <<< 15621 1726882573.94654: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882573.94723: stderr chunk (state=3): >>><<< 15621 1726882573.94731: stdout chunk (state=3): >>><<< 15621 1726882573.94750: done transferring module to remote 15621 1726882573.94763: _low_level_execute_command(): starting 15621 1726882573.94767: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882573.7931018-15862-20270910547269/ /root/.ansible/tmp/ansible-tmp-1726882573.7931018-15862-20270910547269/AnsiballZ_stat.py && sleep 0' 15621 1726882573.95246: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882573.95249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 15621 1726882573.95252: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882573.95254: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882573.95256: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882573.95310: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882573.95315: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882573.95403: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882573.97258: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882573.97305: stderr chunk (state=3): >>><<< 15621 1726882573.97308: stdout chunk (state=3): >>><<< 15621 1726882573.97326: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882573.97329: _low_level_execute_command(): starting 15621 1726882573.97336: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882573.7931018-15862-20270910547269/AnsiballZ_stat.py && sleep 0' 15621 1726882573.97799: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882573.97803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882573.97806: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882573.97808: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882573.97810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882573.97858: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882573.97864: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882573.97955: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882574.00253: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 15621 1726882574.00282: stdout chunk (state=3): >>>import _imp # builtin <<< 15621 1726882574.00316: stdout chunk (state=3): >>>import '_thread' # <<< 15621 1726882574.00320: stdout chunk (state=3): >>>import '_warnings' # import '_weakref' # <<< 15621 1726882574.00392: stdout chunk (state=3): >>>import '_io' # <<< 15621 1726882574.00397: stdout chunk (state=3): >>>import 'marshal' # <<< 15621 1726882574.00429: stdout chunk (state=3): >>>import 'posix' # <<< 15621 1726882574.00467: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 15621 1726882574.00499: stdout chunk (state=3): >>>import 'time' # <<< 15621 1726882574.00502: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 15621 1726882574.00560: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 15621 1726882574.00568: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 15621 1726882574.00578: stdout chunk (state=3): >>>import '_codecs' # <<< 15621 1726882574.00604: stdout chunk (state=3): >>>import 'codecs' # <<< 15621 1726882574.00636: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 15621 1726882574.00675: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 15621 1726882574.00678: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cacbfc530> <<< 15621 1726882574.00682: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cacbcbb30> <<< 15621 1726882574.00703: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 15621 1726882574.00716: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cacbfeab0> <<< 15621 1726882574.00738: stdout chunk (state=3): >>>import '_signal' # <<< 15621 1726882574.00760: stdout chunk (state=3): >>>import '_abc' # <<< 15621 1726882574.00772: stdout chunk (state=3): >>>import 'abc' # <<< 15621 1726882574.00784: stdout chunk (state=3): >>>import 'io' # <<< 15621 1726882574.00823: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 15621 1726882574.00914: stdout chunk (state=3): >>>import '_collections_abc' # <<< 15621 1726882574.00946: stdout chunk (state=3): >>>import 'genericpath' # <<< 15621 1726882574.00949: stdout chunk (state=3): >>>import 'posixpath' # <<< 15621 1726882574.00969: stdout chunk (state=3): >>>import 'os' # <<< 15621 1726882574.00998: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 15621 1726882574.01009: stdout chunk (state=3): >>>Processing user site-packages 
Processing global site-packages <<< 15621 1726882574.01020: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' <<< 15621 1726882574.01032: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 15621 1726882574.01063: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py <<< 15621 1726882574.01067: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 15621 1726882574.01088: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac9d11c0> <<< 15621 1726882574.01146: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 15621 1726882574.01162: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 15621 1726882574.01166: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac9d2000> <<< 15621 1726882574.01191: stdout chunk (state=3): >>>import 'site' # <<< 15621 1726882574.01227: stdout chunk (state=3): >>>Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 15621 1726882574.01474: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 15621 1726882574.01477: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 15621 1726882574.01503: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 15621 1726882574.01507: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 15621 1726882574.01535: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 15621 1726882574.01572: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 15621 1726882574.01590: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 15621 1726882574.01612: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 15621 1726882574.01634: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4caca0fe90> <<< 15621 1726882574.01652: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 15621 1726882574.01667: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 15621 1726882574.01691: stdout chunk (state=3): >>>import '_operator' # <<< 15621 1726882574.01702: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4caca0ff50> <<< 15621 1726882574.01709: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 15621 1726882574.01741: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 15621 1726882574.01763: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 15621 1726882574.01818: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 15621 1726882574.01829: stdout chunk (state=3): >>>import 'itertools' # <<< 15621 1726882574.01856: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4caca47830> <<< 15621 1726882574.01886: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 15621 1726882574.01893: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4caca47ec0> <<< 15621 1726882574.01903: stdout chunk (state=3): >>>import '_collections' # <<< 15621 1726882574.01962: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4caca27b00> <<< 15621 1726882574.01966: stdout chunk (state=3): >>>import '_functools' # <<< 15621 1726882574.01996: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4caca251f0> <<< 15621 1726882574.02095: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4caca0d040> <<< 15621 1726882574.02125: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 15621 1726882574.02140: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 15621 1726882574.02154: stdout chunk (state=3): >>>import '_sre' # <<< 15621 1726882574.02172: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 15621 1726882574.02197: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 15621 1726882574.02228: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 15621 1726882574.02259: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4caca6b7d0> <<< 15621 1726882574.02275: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4caca6a3f0> <<< 15621 1726882574.02301: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' <<< 15621 1726882574.02308: stdout chunk (state=3): >>>import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f4caca262a0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4caca68bf0> <<< 15621 1726882574.02366: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' <<< 15621 1726882574.02384: stdout chunk (state=3): >>>import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4caca98830> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4caca0c2f0> <<< 15621 1726882574.02398: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 15621 1726882574.02444: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 15621 1726882574.02448: stdout chunk (state=3): >>>import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4caca98ce0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4caca98b90> <<< 15621 1726882574.02482: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 15621 1726882574.02495: stdout chunk (state=3): >>>import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4caca98f80> <<< 15621 1726882574.02497: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4caca0ae40> <<< 15621 1726882574.02527: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 15621 1726882574.02555: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 15621 1726882574.02578: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 15621 1726882574.02595: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4caca99670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4caca99340> import 'importlib.machinery' # <<< 15621 1726882574.02632: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 15621 1726882574.02657: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4caca9a570> <<< 15621 1726882574.02676: stdout chunk (state=3): >>>import 'importlib.util' # <<< 15621 1726882574.02679: stdout chunk (state=3): >>>import 'runpy' # <<< 15621 1726882574.02695: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 15621 1726882574.02732: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 15621 1726882574.02758: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py <<< 15621 1726882574.02772: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cacab4770> <<< 15621 1726882574.02785: stdout chunk (state=3): >>>import 'errno' # <<< 15621 1726882574.02808: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 15621 1726882574.02811: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4cacab5eb0> <<< 15621 1726882574.02842: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 15621 1726882574.02847: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 15621 1726882574.02872: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 15621 1726882574.02885: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 15621 1726882574.02891: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cacab6d50> <<< 15621 1726882574.02924: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 15621 1726882574.02936: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4cacab73b0> <<< 15621 1726882574.02938: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cacab62a0> <<< 15621 1726882574.02963: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 15621 1726882574.02966: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 15621 1726882574.03013: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 15621 1726882574.03024: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4cacab7e00> <<< 15621 1726882574.03031: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cacab7530> <<< 15621 1726882574.03072: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4caca9a5a0> <<< 15621 1726882574.03088: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 15621 1726882574.03118: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 15621 1726882574.03136: 
stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 15621 1726882574.03155: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 15621 1726882574.03197: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4cac877cb0><<< 15621 1726882574.03208: stdout chunk (state=3): >>> <<< 15621 1726882574.03218: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 15621 1726882574.03245: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' <<< 15621 1726882574.03249: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4cac8a07a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac8a0500> <<< 15621 1726882574.03273: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4cac8a0740> <<< 15621 1726882574.03305: stdout chunk (state=3): >>># extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' <<< 15621 1726882574.03315: stdout chunk (state=3): >>>import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4cac8a0950> <<< 15621 1726882574.03320: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac875e50> <<< 15621 1726882574.03344: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 15621 1726882574.03441: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 15621 1726882574.03468: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 15621 1726882574.03481: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 15621 1726882574.03484: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac8a1fd0> <<< 15621 1726882574.03502: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac8a0c50> <<< 15621 1726882574.03531: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4caca9ac90> <<< 15621 1726882574.03555: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 15621 1726882574.03603: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 15621 1726882574.03621: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 15621 1726882574.03673: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 15621 1726882574.03692: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac8ce390> <<< 15621 1726882574.03746: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 15621 1726882574.03757: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 15621 1726882574.03781: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 15621 1726882574.03796: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 15621 1726882574.03851: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac8e6510> <<< 15621 1726882574.03867: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 15621 1726882574.03907: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 15621 1726882574.03958: stdout chunk (state=3): >>>import 'ntpath' # <<< 15621 1726882574.03987: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac91f2c0> <<< 15621 1726882574.04008: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 15621 1726882574.04042: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 15621 1726882574.04073: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 15621 1726882574.04105: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 15621 1726882574.04203: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac941a60> <<< 15621 1726882574.04275: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac91f3e0> <<< 15621 1726882574.04320: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac8e6bd0> <<< 15621 1726882574.04347: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py <<< 15621 1726882574.04365: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac71c440> <<< 15621 1726882574.04368: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac8e5550> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac8a2f30> <<< 15621 1726882574.04468: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 15621 1726882574.04490: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f4cac8e5910> <<< 15621 1726882574.04568: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_y0uvxg45/ansible_stat_payload.zip' <<< 15621 1726882574.04575: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882574.04719: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882574.04752: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 15621 1726882574.04757: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 15621 1726882574.04801: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 15621 1726882574.04879: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 15621 1726882574.04913: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac7761b0> <<< 15621 1726882574.04920: stdout chunk (state=3): >>>import '_typing' # <<< 15621 1726882574.05114: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac74d0a0> <<< 15621 1726882574.05120: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac74c230> # zipimport: zlib available <<< 15621 1726882574.05151: stdout chunk (state=3): >>>import 'ansible' # <<< 15621 1726882574.05156: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882574.05178: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882574.05188: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882574.05205: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 15621 1726882574.05213: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882574.06751: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882574.07992: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<< 15621 1726882574.07998: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac74f5c0> <<< 15621 1726882574.08020: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 
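The stdout chunks around this point come from running the transferred AnsiballZ_stat.py under PYTHONVERBOSE=1; the "# zipimport: found 30 names in '/tmp/ansible_stat_payload_.../ansible_stat_payload.zip'" and "# zipimport: zlib available" lines show the interpreter loading ansible.module_utils straight out of the embedded zip payload. The short, self-contained sketch below demonstrates that mechanism with a throwaway archive; the file and module names are invented for the demonstration and are not part of the real payload.

# Demonstration of importing a module directly from a zip archive on sys.path,
# the same mechanism (zipimport) the verbose trace above records for the
# AnsiballZ payload. Names here are made up for the demo.
import importlib
import os
import sys
import tempfile
import zipfile

workdir = tempfile.mkdtemp(prefix="zipimport_demo_")
payload = os.path.join(workdir, "demo_payload.zip")

with zipfile.ZipFile(payload, "w", zipfile.ZIP_DEFLATED) as zf:
    # One tiny module inside the archive, analogous to the modules listed by
    # the "# zipimport: found 30 names in ..." line.
    zf.writestr("demo_module.py", "GREETING = 'imported from a zip'\n")

sys.path.insert(0, payload)               # make the archive importable
demo_module = importlib.import_module("demo_module")
print(demo_module.__file__)               # resolves to a path inside demo_payload.zip
print(demo_module.GREETING)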
15621 1726882574.08052: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 15621 1726882574.08073: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 15621 1726882574.08084: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 15621 1726882574.08119: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 15621 1726882574.08125: stdout chunk (state=3): >>>import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4cac79dbe0> <<< 15621 1726882574.08152: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac79d970> <<< 15621 1726882574.08195: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac79d280> <<< 15621 1726882574.08209: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 15621 1726882574.08217: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 15621 1726882574.08261: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac79d6d0> <<< 15621 1726882574.08264: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac776e40> <<< 15621 1726882574.08269: stdout chunk (state=3): >>>import 'atexit' # <<< 15621 1726882574.08293: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4cac79e990> <<< 15621 1726882574.08321: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 15621 1726882574.08334: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4cac79ebd0> <<< 15621 1726882574.08344: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 15621 1726882574.08385: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 15621 1726882574.08407: stdout chunk (state=3): >>>import '_locale' # <<< 15621 1726882574.08453: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac79f050> <<< 15621 1726882574.08458: stdout chunk (state=3): >>>import 'pwd' # <<< 15621 1726882574.08477: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 15621 1726882574.08505: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 15621 1726882574.08537: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac600d40> <<< 15621 1726882574.08568: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 15621 1726882574.08581: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4cac6029f0> <<< 15621 1726882574.08596: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 15621 1726882574.08599: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 15621 1726882574.08647: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac603350> <<< 15621 1726882574.08659: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 15621 1726882574.08689: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 15621 1726882574.08707: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac604290> <<< 15621 1726882574.08729: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 15621 1726882574.08759: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 15621 1726882574.08782: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 15621 1726882574.08846: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac606f90> <<< 15621 1726882574.08888: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' <<< 15621 1726882574.08892: stdout chunk (state=3): >>># extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4cac6070b0> <<< 15621 1726882574.08901: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac605160> <<< 15621 1726882574.08927: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 15621 1726882574.08953: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 15621 1726882574.08973: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 15621 1726882574.08994: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 15621 1726882574.09019: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 15621 1726882574.09044: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 15621 1726882574.09067: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac60ae10> import '_tokenize' # <<< 15621 1726882574.09143: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac6098e0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac609670> <<< 15621 1726882574.09168: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 15621 1726882574.09256: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac60bcb0> <<< 15621 1726882574.09281: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac605760> <<< 15621 1726882574.09304: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4cac652ff0> <<< 15621 1726882574.09334: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac653110> <<< 15621 1726882574.09361: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 15621 1726882574.09379: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 15621 1726882574.09398: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 15621 1726882574.09438: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4cac654d40> <<< 15621 1726882574.09442: stdout chunk (state=3): >>>import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac654b00> <<< 15621 1726882574.09457: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 15621 1726882574.09573: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 15621 1726882574.09619: stdout chunk (state=3): >>># extension module '_uuid' loaded from 
'/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 15621 1726882574.09627: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4cac657260> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac655400> <<< 15621 1726882574.09648: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 15621 1726882574.09691: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 15621 1726882574.09713: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 15621 1726882574.09716: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 15621 1726882574.09735: stdout chunk (state=3): >>>import '_string' # <<< 15621 1726882574.09769: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac65ea20> <<< 15621 1726882574.09903: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac6573e0> <<< 15621 1726882574.09975: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 15621 1726882574.09981: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4cac65f890> <<< 15621 1726882574.10007: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 15621 1726882574.10012: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4cac65f710> <<< 15621 1726882574.10054: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 15621 1726882574.10062: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4cac65fce0> <<< 15621 1726882574.10068: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac653410> <<< 15621 1726882574.10090: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 15621 1726882574.10112: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 15621 1726882574.10131: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 15621 1726882574.10163: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 15621 1726882574.10188: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4cac663500> <<< 15621 1726882574.10363: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 15621 1726882574.10366: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4cac664740> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac661c70> <<< 15621 1726882574.10404: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4cac663020> <<< 15621 1726882574.10412: stdout chunk (state=3): >>>import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac661880> <<< 15621 1726882574.10430: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882574.10439: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # <<< 15621 1726882574.10464: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882574.10554: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882574.10657: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882574.10662: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882574.10682: stdout chunk (state=3): >>>import 'ansible.module_utils.common' # <<< 15621 1726882574.10689: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15621 1726882574.10709: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text' # <<< 15621 1726882574.10716: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882574.10838: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882574.10967: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882574.11564: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882574.12217: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # <<< 15621 1726882574.12236: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 15621 1726882574.12240: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 15621 1726882574.12245: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 15621 1726882574.12285: stdout chunk (state=3): >>># extension module '_ctypes' loaded from 
'/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' <<< 15621 1726882574.12299: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4cac6e8890> <<< 15621 1726882574.12375: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 15621 1726882574.12405: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac6e95e0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac667bf0> <<< 15621 1726882574.12453: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 15621 1726882574.12473: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882574.12510: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 15621 1726882574.12663: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882574.12975: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 15621 1726882574.12978: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac6e96d0> # zipimport: zlib available <<< 15621 1726882574.13362: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882574.13949: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15621 1726882574.14008: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 15621 1726882574.14057: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882574.14067: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882574.14100: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 15621 1726882574.14182: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882574.14301: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # # zipimport: zlib available <<< 15621 1726882574.14313: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 15621 1726882574.14365: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882574.14404: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 15621 1726882574.14417: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882574.14661: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882574.14912: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 15621 1726882574.14960: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 15621 1726882574.14981: stdout chunk (state=3): >>>import '_ast' # <<< 15621 1726882574.15050: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac6eb290> <<< 15621 1726882574.15057: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 
1726882574.15132: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882574.15213: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # <<< 15621 1726882574.15217: stdout chunk (state=3): >>>import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 15621 1726882574.15247: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 15621 1726882574.15251: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 15621 1726882574.15329: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 15621 1726882574.15460: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4cac6f6330><<< 15621 1726882574.15466: stdout chunk (state=3): >>> <<< 15621 1726882574.15508: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 15621 1726882574.15512: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 15621 1726882574.15534: stdout chunk (state=3): >>>import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4cac6f6c60> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac667e60> # zipimport: zlib available <<< 15621 1726882574.15584: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882574.15619: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 15621 1726882574.15637: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882574.15678: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882574.15726: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882574.15782: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882574.15855: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 15621 1726882574.15887: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 15621 1726882574.15975: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4cac6f5910> <<< 15621 1726882574.16014: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac6f6de0> <<< 15621 1726882574.16046: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 15621 1726882574.16049: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882574.16125: stdout chunk (state=3): >>># zipimport: zlib available 
<<< 15621 1726882574.16188: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882574.16216: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882574.16254: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 15621 1726882574.16286: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 15621 1726882574.16295: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 15621 1726882574.16320: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 15621 1726882574.16375: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 15621 1726882574.16400: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 15621 1726882574.16408: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 15621 1726882574.16470: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac586ed0> <<< 15621 1726882574.16512: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac503c50> <<< 15621 1726882574.16600: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac6fbe60> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac6faba0> # destroy ansible.module_utils.distro <<< 15621 1726882574.16607: stdout chunk (state=3): >>>import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 15621 1726882574.16641: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882574.16669: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # <<< 15621 1726882574.16674: stdout chunk (state=3): >>>import 'ansible.module_utils.common.sys_info' # <<< 15621 1726882574.16730: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 15621 1726882574.16738: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882574.16750: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882574.16768: stdout chunk (state=3): >>>import 'ansible.modules' # <<< 15621 1726882574.16777: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882574.16915: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882574.17115: stdout chunk (state=3): >>># zipimport: zlib available <<< 15621 1726882574.17230: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 15621 1726882574.17256: stdout chunk (state=3): >>># destroy __main__ <<< 15621 1726882574.17546: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type <<< 15621 
1726882574.17558: stdout chunk (state=3): >>># clear sys.last_value # clear sys.last_traceback <<< 15621 1726882574.17575: stdout chunk (state=3): >>># clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc <<< 15621 1726882574.17608: stdout chunk (state=3): >>># cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler <<< 15621 1726882574.17630: stdout chunk (state=3): >>># cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib <<< 15621 1726882574.17645: stdout chunk (state=3): >>># cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # 
cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible <<< 15621 1726882574.17651: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd <<< 15621 1726882574.17686: stdout chunk (state=3): >>># cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters <<< 15621 1726882574.17699: stdout chunk (state=3): >>># cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing 
ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib <<< 15621 1726882574.17713: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 15621 1726882574.17942: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 15621 1726882574.17958: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 15621 1726882574.17978: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii<<< 15621 1726882574.17992: stdout chunk (state=3): >>> # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib <<< 15621 1726882574.17996: stdout chunk (state=3): >>># destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress <<< 15621 1726882574.18031: stdout chunk (state=3): >>># destroy ntpath <<< 15621 1726882574.18037: stdout chunk (state=3): >>># destroy importlib # destroy zipimport <<< 15621 1726882574.18061: stdout chunk (state=3): >>># destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib <<< 15621 1726882574.18078: stdout chunk (state=3): >>># destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd <<< 15621 1726882574.18092: stdout chunk (state=3): >>># destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 15621 1726882574.18104: stdout chunk (state=3): >>># destroy selectors # destroy errno # destroy array <<< 15621 1726882574.18115: stdout chunk (state=3): >>># destroy datetime <<< 15621 1726882574.18131: stdout chunk (state=3): >>># destroy _hashlib # destroy _blake2 <<< 15621 1726882574.18146: stdout chunk (state=3): >>># destroy selinux # destroy shutil # destroy distro # destroy distro.distro <<< 15621 1726882574.18153: stdout chunk (state=3): >>># destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess <<< 15621 1726882574.18196: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux <<< 15621 1726882574.18201: stdout chunk 
(state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket <<< 15621 1726882574.18220: stdout chunk (state=3): >>># cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 15621 1726882574.18226: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 <<< 15621 1726882574.18248: stdout chunk (state=3): >>># cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external <<< 15621 1726882574.18263: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser <<< 15621 1726882574.18290: stdout chunk (state=3): >>># cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections <<< 15621 1726882574.18319: stdout chunk (state=3): >>># destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs <<< 15621 1726882574.18325: stdout chunk (state=3): >>># cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib <<< 15621 1726882574.18337: stdout chunk (state=3): >>># cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 15621 1726882574.18469: stdout chunk (state=3): >>># destroy sys.monitoring <<< 15621 1726882574.18475: stdout chunk (state=3): >>># destroy _socket <<< 15621 1726882574.18490: stdout chunk (state=3): >>># destroy _collections <<< 15621 1726882574.18511: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath <<< 15621 1726882574.18527: stdout chunk (state=3): >>># destroy re._parser # destroy tokenize <<< 15621 1726882574.18542: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 15621 1726882574.18554: stdout chunk (state=3): >>># destroy _typing <<< 
15621 1726882574.18580: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 15621 1726882574.18586: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 15621 1726882574.18597: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 15621 1726882574.18683: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs <<< 15621 1726882574.18686: stdout chunk (state=3): >>># destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect <<< 15621 1726882574.18711: stdout chunk (state=3): >>># destroy time <<< 15621 1726882574.18718: stdout chunk (state=3): >>># destroy _random <<< 15621 1726882574.18728: stdout chunk (state=3): >>># destroy _weakref <<< 15621 1726882574.18741: stdout chunk (state=3): >>># destroy _operator # destroy _sha2 # destroy _string # destroy re <<< 15621 1726882574.18777: stdout chunk (state=3): >>># destroy itertools <<< 15621 1726882574.18781: stdout chunk (state=3): >>># destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins <<< 15621 1726882574.18783: stdout chunk (state=3): >>># destroy _thread # clear sys.audit hooks <<< 15621 1726882574.19134: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 
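The remote run that just closed is a stat probe of /run/ostree-booted (commonly used to detect ostree-based hosts). The module_args echoed in the JSON result a few chunks earlier show follow=false, get_checksum=true, get_mime=true, get_attributes=true, checksum_algorithm=sha1, and since the path is absent the module returns {"changed": false, "stat": {"exists": false}}. A rough Python equivalent of that probe, not the module's actual source:

import json
import os

def probe(path: str) -> dict:
    # When the path does not exist, the real module likewise reports only
    # {"exists": false}; checksum/mime/attribute options have nothing to act on.
    return {"changed": False, "stat": {"exists": os.path.lexists(path)}}

print(json.dumps(probe("/run/ostree-booted")))  # -> {"changed": false, "stat": {"exists": false}}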
<<< 15621 1726882574.19196: stderr chunk (state=3): >>><<< 15621 1726882574.19199: stdout chunk (state=3): >>><<< 15621 1726882574.19264: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cacbfc530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cacbcbb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cacbfeab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac9d11c0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac9d2000> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4caca0fe90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4caca0ff50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4caca47830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4caca47ec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4caca27b00> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4caca251f0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4caca0d040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4caca6b7d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4caca6a3f0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4caca262a0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4caca68bf0> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4caca98830> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4caca0c2f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4caca98ce0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4caca98b90> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4caca98f80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4caca0ae40> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4caca99670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4caca99340> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4caca9a570> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cacab4770> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4cacab5eb0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f4cacab6d50> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4cacab73b0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cacab62a0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4cacab7e00> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cacab7530> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4caca9a5a0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4cac877cb0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4cac8a07a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac8a0500> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4cac8a0740> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4cac8a0950> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac875e50> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac8a1fd0> 
import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac8a0c50> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4caca9ac90> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac8ce390> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac8e6510> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac91f2c0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac941a60> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac91f3e0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac8e6bd0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac71c440> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac8e5550> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac8a2f30> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f4cac8e5910> # zipimport: found 30 names in '/tmp/ansible_stat_payload_y0uvxg45/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # 
/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac7761b0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac74d0a0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac74c230> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac74f5c0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4cac79dbe0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac79d970> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac79d280> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac79d6d0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac776e40> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4cac79e990> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4cac79ebd0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac79f050> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from 
'/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac600d40> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4cac6029f0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac603350> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac604290> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac606f90> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4cac6070b0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac605160> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac60ae10> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac6098e0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac609670> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac60bcb0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac605760> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4cac652ff0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac653110> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4cac654d40> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac654b00> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4cac657260> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac655400> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac65ea20> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac6573e0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4cac65f890> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4cac65f710> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 
'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4cac65fce0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac653410> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4cac663500> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4cac664740> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac661c70> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4cac663020> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac661880> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4cac6e8890> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac6e95e0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac667bf0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available 
# zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac6e96d0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac6eb290> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4cac6f6330> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4cac6f6c60> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac667e60> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4cac6f5910> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac6f6de0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac586ed0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac503c50> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac6fbe60> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4cac6faba0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy 
reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # 
cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy 
ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # 
destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 
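The module run traced above is Ansible's stat module checking for /run/ostree-booted; the arguments are visible in the JSON result embedded in the stdout dump ("path": "/run/ostree-booted", "get_checksum": true, and so on). A plausible reconstruction of the corresponding task, assuming it is the "Check if system is ostree" task referenced later in this log and that its result is registered as the __ostree_booted_stat variable seen below — the exact YAML in el_repo_setup.yml may differ:

    # Hypothetical sketch, not copied from el_repo_setup.yml
    - name: Check if system is ostree
      ansible.builtin.stat:
        path: /run/ostree-booted
      register: __ostree_booted_stat
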
[WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # 
cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy 
ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy 
re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 15621 1726882574.19819: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882573.7931018-15862-20270910547269/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15621 1726882574.19824: _low_level_execute_command(): starting 15621 1726882574.19827: 
_low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882573.7931018-15862-20270910547269/ > /dev/null 2>&1 && sleep 0' 15621 1726882574.19968: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882574.19971: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882574.19974: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882574.19976: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882574.19984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 15621 1726882574.19994: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882574.20026: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882574.20043: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882574.20131: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882574.22031: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882574.22075: stderr chunk (state=3): >>><<< 15621 1726882574.22079: stdout chunk (state=3): >>><<< 15621 1726882574.22093: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882574.22100: handler run complete 15621 1726882574.22119: attempt loop complete, returning result 15621 1726882574.22123: _execute() done 15621 1726882574.22126: dumping result to json 15621 1726882574.22130: done dumping result, returning 15621 1726882574.22138: done running TaskExecutor() for 
managed_node3/TASK: Check if system is ostree [0affc7ec-ae25-af1a-5b92-00000000008f] 15621 1726882574.22143: sending task result for task 0affc7ec-ae25-af1a-5b92-00000000008f 15621 1726882574.22235: done sending task result for task 0affc7ec-ae25-af1a-5b92-00000000008f 15621 1726882574.22238: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 15621 1726882574.22298: no more pending results, returning what we have 15621 1726882574.22300: results queue empty 15621 1726882574.22301: checking for any_errors_fatal 15621 1726882574.22307: done checking for any_errors_fatal 15621 1726882574.22308: checking for max_fail_percentage 15621 1726882574.22310: done checking for max_fail_percentage 15621 1726882574.22310: checking to see if all hosts have failed and the running result is not ok 15621 1726882574.22311: done checking to see if all hosts have failed 15621 1726882574.22312: getting the remaining hosts for this loop 15621 1726882574.22313: done getting the remaining hosts for this loop 15621 1726882574.22317: getting the next task for host managed_node3 15621 1726882574.22325: done getting next task for host managed_node3 15621 1726882574.22327: ^ task is: TASK: Set flag to indicate system is ostree 15621 1726882574.22330: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882574.22334: getting variables 15621 1726882574.22335: in VariableManager get_vars() 15621 1726882574.22365: Calling all_inventory to load vars for managed_node3 15621 1726882574.22368: Calling groups_inventory to load vars for managed_node3 15621 1726882574.22371: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882574.22382: Calling all_plugins_play to load vars for managed_node3 15621 1726882574.22385: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882574.22388: Calling groups_plugins_play to load vars for managed_node3 15621 1726882574.22552: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882574.22702: done with get_vars() 15621 1726882574.22709: done getting variables 15621 1726882574.22786: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 21:36:14 -0400 (0:00:00.485) 0:00:06.307 ****** 15621 1726882574.22808: entering _queue_task() for managed_node3/set_fact 15621 1726882574.22809: Creating lock for set_fact 15621 1726882574.23019: worker is 1 (out of 1 available) 15621 1726882574.23035: exiting _queue_task() for managed_node3/set_fact 15621 1726882574.23049: done queuing things up, now waiting for results queue to drain 15621 1726882574.23050: waiting for pending results... 
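The "Set flag to indicate system is ostree" task queued here (el_repo_setup.yml:22) evaluates the conditional (not __network_is_ostree is defined) and derives __network_is_ostree from the registered stat result, as the execution trace below shows. A minimal sketch of what that task likely looks like, assuming the usual linux_system_roles pattern — the exact Jinja2 expression is an assumption, not taken from this log:

    # Hypothetical sketch of the set_fact task
    - name: Set flag to indicate system is ostree
      ansible.builtin.set_fact:
        __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
      when: not __network_is_ostree is defined
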
15621 1726882574.23212: running TaskExecutor() for managed_node3/TASK: Set flag to indicate system is ostree 15621 1726882574.23282: in run() - task 0affc7ec-ae25-af1a-5b92-000000000090 15621 1726882574.23292: variable 'ansible_search_path' from source: unknown 15621 1726882574.23296: variable 'ansible_search_path' from source: unknown 15621 1726882574.23330: calling self._execute() 15621 1726882574.23388: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882574.23394: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882574.23403: variable 'omit' from source: magic vars 15621 1726882574.23762: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15621 1726882574.23942: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15621 1726882574.23980: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15621 1726882574.24005: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15621 1726882574.24035: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15621 1726882574.24125: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15621 1726882574.24146: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15621 1726882574.24166: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882574.24190: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15621 1726882574.24286: Evaluated conditional (not __network_is_ostree is defined): True 15621 1726882574.24289: variable 'omit' from source: magic vars 15621 1726882574.24316: variable 'omit' from source: magic vars 15621 1726882574.24403: variable '__ostree_booted_stat' from source: set_fact 15621 1726882574.24443: variable 'omit' from source: magic vars 15621 1726882574.24463: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882574.24488: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882574.24504: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882574.24518: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882574.24528: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882574.24552: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882574.24556: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882574.24558: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882574.24634: Set connection var ansible_connection to ssh 15621 
1726882574.24642: Set connection var ansible_shell_executable to /bin/sh 15621 1726882574.24647: Set connection var ansible_timeout to 10 15621 1726882574.24650: Set connection var ansible_shell_type to sh 15621 1726882574.24655: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882574.24660: Set connection var ansible_pipelining to False 15621 1726882574.24681: variable 'ansible_shell_executable' from source: unknown 15621 1726882574.24685: variable 'ansible_connection' from source: unknown 15621 1726882574.24687: variable 'ansible_module_compression' from source: unknown 15621 1726882574.24691: variable 'ansible_shell_type' from source: unknown 15621 1726882574.24694: variable 'ansible_shell_executable' from source: unknown 15621 1726882574.24696: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882574.24698: variable 'ansible_pipelining' from source: unknown 15621 1726882574.24701: variable 'ansible_timeout' from source: unknown 15621 1726882574.24705: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882574.24780: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15621 1726882574.24788: variable 'omit' from source: magic vars 15621 1726882574.24793: starting attempt loop 15621 1726882574.24796: running the handler 15621 1726882574.24805: handler run complete 15621 1726882574.24813: attempt loop complete, returning result 15621 1726882574.24816: _execute() done 15621 1726882574.24819: dumping result to json 15621 1726882574.24824: done dumping result, returning 15621 1726882574.24834: done running TaskExecutor() for managed_node3/TASK: Set flag to indicate system is ostree [0affc7ec-ae25-af1a-5b92-000000000090] 15621 1726882574.24837: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000090 15621 1726882574.24915: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000090 15621 1726882574.24920: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 15621 1726882574.24977: no more pending results, returning what we have 15621 1726882574.24980: results queue empty 15621 1726882574.24981: checking for any_errors_fatal 15621 1726882574.24987: done checking for any_errors_fatal 15621 1726882574.24988: checking for max_fail_percentage 15621 1726882574.24989: done checking for max_fail_percentage 15621 1726882574.24990: checking to see if all hosts have failed and the running result is not ok 15621 1726882574.24991: done checking to see if all hosts have failed 15621 1726882574.24992: getting the remaining hosts for this loop 15621 1726882574.24993: done getting the remaining hosts for this loop 15621 1726882574.24996: getting the next task for host managed_node3 15621 1726882574.25004: done getting next task for host managed_node3 15621 1726882574.25006: ^ task is: TASK: Fix CentOS6 Base repo 15621 1726882574.25008: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15621 1726882574.25011: getting variables 15621 1726882574.25013: in VariableManager get_vars() 15621 1726882574.25045: Calling all_inventory to load vars for managed_node3 15621 1726882574.25047: Calling groups_inventory to load vars for managed_node3 15621 1726882574.25050: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882574.25059: Calling all_plugins_play to load vars for managed_node3 15621 1726882574.25062: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882574.25070: Calling groups_plugins_play to load vars for managed_node3 15621 1726882574.25180: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882574.25296: done with get_vars() 15621 1726882574.25303: done getting variables 15621 1726882574.25390: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Friday 20 September 2024 21:36:14 -0400 (0:00:00.026) 0:00:06.333 ****** 15621 1726882574.25410: entering _queue_task() for managed_node3/copy 15621 1726882574.25594: worker is 1 (out of 1 available) 15621 1726882574.25607: exiting _queue_task() for managed_node3/copy 15621 1726882574.25620: done queuing things up, now waiting for results queue to drain 15621 1726882574.25624: waiting for pending results... 
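The next task, "Fix CentOS6 Base repo" (el_repo_setup.yml:26), is queued here and skipped just below because ansible_distribution == 'CentOS' evaluates to False on this managed node. A hedged sketch of the shape such a copy task typically has; the destination path and the repo contents are assumptions, since the trace records only the copy action and the failing condition:

- name: Fix CentOS6 Base repo
  copy:
    dest: /etc/yum.repos.d/CentOS-Base.repo   # assumed destination
    content: |
      # assumed contents: repoint the Base repo at the CentOS vault
      [base]
      name=CentOS-$releasever - Base
      baseurl=https://vault.centos.org/6.10/os/$basearch/
      gpgcheck=0
  when: ansible_distribution == 'CentOS'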
15621 1726882574.25765: running TaskExecutor() for managed_node3/TASK: Fix CentOS6 Base repo 15621 1726882574.25827: in run() - task 0affc7ec-ae25-af1a-5b92-000000000092 15621 1726882574.25838: variable 'ansible_search_path' from source: unknown 15621 1726882574.25841: variable 'ansible_search_path' from source: unknown 15621 1726882574.25873: calling self._execute() 15621 1726882574.25931: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882574.25937: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882574.25945: variable 'omit' from source: magic vars 15621 1726882574.26336: variable 'ansible_distribution' from source: facts 15621 1726882574.26351: Evaluated conditional (ansible_distribution == 'CentOS'): False 15621 1726882574.26354: when evaluation is False, skipping this task 15621 1726882574.26357: _execute() done 15621 1726882574.26360: dumping result to json 15621 1726882574.26365: done dumping result, returning 15621 1726882574.26371: done running TaskExecutor() for managed_node3/TASK: Fix CentOS6 Base repo [0affc7ec-ae25-af1a-5b92-000000000092] 15621 1726882574.26379: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000092 15621 1726882574.26474: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000092 15621 1726882574.26477: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution == 'CentOS'", "skip_reason": "Conditional result was False" } 15621 1726882574.26558: no more pending results, returning what we have 15621 1726882574.26561: results queue empty 15621 1726882574.26561: checking for any_errors_fatal 15621 1726882574.26565: done checking for any_errors_fatal 15621 1726882574.26565: checking for max_fail_percentage 15621 1726882574.26573: done checking for max_fail_percentage 15621 1726882574.26574: checking to see if all hosts have failed and the running result is not ok 15621 1726882574.26575: done checking to see if all hosts have failed 15621 1726882574.26576: getting the remaining hosts for this loop 15621 1726882574.26577: done getting the remaining hosts for this loop 15621 1726882574.26580: getting the next task for host managed_node3 15621 1726882574.26591: done getting next task for host managed_node3 15621 1726882574.26595: ^ task is: TASK: Include the task 'enable_epel.yml' 15621 1726882574.26597: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882574.26601: getting variables 15621 1726882574.26602: in VariableManager get_vars() 15621 1726882574.26628: Calling all_inventory to load vars for managed_node3 15621 1726882574.26631: Calling groups_inventory to load vars for managed_node3 15621 1726882574.26634: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882574.26644: Calling all_plugins_play to load vars for managed_node3 15621 1726882574.26647: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882574.26650: Calling groups_plugins_play to load vars for managed_node3 15621 1726882574.26785: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882574.26900: done with get_vars() 15621 1726882574.26908: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Friday 20 September 2024 21:36:14 -0400 (0:00:00.015) 0:00:06.348 ****** 15621 1726882574.26970: entering _queue_task() for managed_node3/include_tasks 15621 1726882574.27146: worker is 1 (out of 1 available) 15621 1726882574.27161: exiting _queue_task() for managed_node3/include_tasks 15621 1726882574.27173: done queuing things up, now waiting for results queue to drain 15621 1726882574.27175: waiting for pending results... 15621 1726882574.27311: running TaskExecutor() for managed_node3/TASK: Include the task 'enable_epel.yml' 15621 1726882574.27377: in run() - task 0affc7ec-ae25-af1a-5b92-000000000093 15621 1726882574.27387: variable 'ansible_search_path' from source: unknown 15621 1726882574.27390: variable 'ansible_search_path' from source: unknown 15621 1726882574.27420: calling self._execute() 15621 1726882574.27478: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882574.27483: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882574.27491: variable 'omit' from source: magic vars 15621 1726882574.27840: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15621 1726882574.29961: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15621 1726882574.30019: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15621 1726882574.30071: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15621 1726882574.30128: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15621 1726882574.30166: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15621 1726882574.30238: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882574.30259: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882574.30279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 15621 1726882574.30315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882574.30328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882574.30418: variable '__network_is_ostree' from source: set_fact 15621 1726882574.30434: Evaluated conditional (not __network_is_ostree | d(false)): True 15621 1726882574.30440: _execute() done 15621 1726882574.30443: dumping result to json 15621 1726882574.30448: done dumping result, returning 15621 1726882574.30454: done running TaskExecutor() for managed_node3/TASK: Include the task 'enable_epel.yml' [0affc7ec-ae25-af1a-5b92-000000000093] 15621 1726882574.30459: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000093 15621 1726882574.30546: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000093 15621 1726882574.30549: WORKER PROCESS EXITING 15621 1726882574.30582: no more pending results, returning what we have 15621 1726882574.30587: in VariableManager get_vars() 15621 1726882574.30620: Calling all_inventory to load vars for managed_node3 15621 1726882574.30624: Calling groups_inventory to load vars for managed_node3 15621 1726882574.30628: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882574.30639: Calling all_plugins_play to load vars for managed_node3 15621 1726882574.30642: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882574.30645: Calling groups_plugins_play to load vars for managed_node3 15621 1726882574.30787: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882574.30905: done with get_vars() 15621 1726882574.30911: variable 'ansible_search_path' from source: unknown 15621 1726882574.30912: variable 'ansible_search_path' from source: unknown 15621 1726882574.30940: we have included files to process 15621 1726882574.30940: generating all_blocks data 15621 1726882574.30943: done generating all_blocks data 15621 1726882574.30949: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 15621 1726882574.30950: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 15621 1726882574.30952: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 15621 1726882574.31456: done processing included file 15621 1726882574.31458: iterating over new_blocks loaded from include file 15621 1726882574.31459: in VariableManager get_vars() 15621 1726882574.31467: done with get_vars() 15621 1726882574.31468: filtering new block on tags 15621 1726882574.31483: done filtering new block on tags 15621 1726882574.31485: in VariableManager get_vars() 15621 1726882574.31493: done with get_vars() 15621 1726882574.31494: filtering new block on tags 15621 1726882574.31502: done filtering new block on tags 15621 1726882574.31504: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node3 15621 1726882574.31507: extending task lists for all hosts with included blocks 
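The include at el_repo_setup.yml:51 evaluates the conditional "not __network_is_ostree | d(false)" to True, so enable_epel.yml is loaded from the same tasks directory and its blocks are appended to the host's task list, as the "extending task lists" entries above show. A minimal sketch, assuming the file is referenced relative to the including task file:

- name: Include the task 'enable_epel.yml'
  include_tasks: enable_epel.yml
  when: not __network_is_ostree | d(false)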
15621 1726882574.31572: done extending task lists 15621 1726882574.31573: done processing included files 15621 1726882574.31574: results queue empty 15621 1726882574.31574: checking for any_errors_fatal 15621 1726882574.31576: done checking for any_errors_fatal 15621 1726882574.31577: checking for max_fail_percentage 15621 1726882574.31578: done checking for max_fail_percentage 15621 1726882574.31578: checking to see if all hosts have failed and the running result is not ok 15621 1726882574.31579: done checking to see if all hosts have failed 15621 1726882574.31579: getting the remaining hosts for this loop 15621 1726882574.31580: done getting the remaining hosts for this loop 15621 1726882574.31582: getting the next task for host managed_node3 15621 1726882574.31584: done getting next task for host managed_node3 15621 1726882574.31586: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 15621 1726882574.31588: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882574.31589: getting variables 15621 1726882574.31590: in VariableManager get_vars() 15621 1726882574.31595: Calling all_inventory to load vars for managed_node3 15621 1726882574.31596: Calling groups_inventory to load vars for managed_node3 15621 1726882574.31598: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882574.31603: Calling all_plugins_play to load vars for managed_node3 15621 1726882574.31609: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882574.31611: Calling groups_plugins_play to load vars for managed_node3 15621 1726882574.31704: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882574.31816: done with get_vars() 15621 1726882574.31825: done getting variables 15621 1726882574.31872: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 15621 1726882574.32010: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 40] ********************************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Friday 20 September 2024 21:36:14 -0400 (0:00:00.050) 0:00:06.399 ****** 15621 1726882574.32056: entering _queue_task() for managed_node3/command 15621 1726882574.32059: Creating lock for command 15621 1726882574.32552: worker is 1 (out of 1 available) 15621 1726882574.32559: exiting _queue_task() for managed_node3/command 15621 1726882574.32568: done queuing things up, now waiting for results queue to drain 15621 1726882574.32570: waiting for pending results... 
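The task name at enable_epel.yml:8 is templated, so the banner renders as "Create EPEL 40" once ansible_distribution_major_version is substituted; the command action is then skipped in the following trace because ansible_distribution in ['RedHat', 'CentOS'] is False. A sketch of the structure; the actual command is not recorded in the trace, so the rpm invocation below is an assumption (installing the matching epel-release package is the usual pattern):

- name: Create EPEL {{ ansible_distribution_major_version }}
  command: >-   # assumed command; the trace records only the command action and the guard
    rpm -iv https://dl.fedoraproject.org/pub/epel/epel-release-latest-{{ ansible_distribution_major_version }}.noarch.rpm
  when: ansible_distribution in ['RedHat', 'CentOS']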
15621 1726882574.32699: running TaskExecutor() for managed_node3/TASK: Create EPEL 40 15621 1726882574.32795: in run() - task 0affc7ec-ae25-af1a-5b92-0000000000ad 15621 1726882574.32799: variable 'ansible_search_path' from source: unknown 15621 1726882574.32801: variable 'ansible_search_path' from source: unknown 15621 1726882574.32804: calling self._execute() 15621 1726882574.32882: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882574.32895: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882574.32916: variable 'omit' from source: magic vars 15621 1726882574.33299: variable 'ansible_distribution' from source: facts 15621 1726882574.33315: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 15621 1726882574.33325: when evaluation is False, skipping this task 15621 1726882574.33332: _execute() done 15621 1726882574.33429: dumping result to json 15621 1726882574.33433: done dumping result, returning 15621 1726882574.33435: done running TaskExecutor() for managed_node3/TASK: Create EPEL 40 [0affc7ec-ae25-af1a-5b92-0000000000ad] 15621 1726882574.33437: sending task result for task 0affc7ec-ae25-af1a-5b92-0000000000ad 15621 1726882574.33512: done sending task result for task 0affc7ec-ae25-af1a-5b92-0000000000ad 15621 1726882574.33516: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 15621 1726882574.33572: no more pending results, returning what we have 15621 1726882574.33575: results queue empty 15621 1726882574.33576: checking for any_errors_fatal 15621 1726882574.33577: done checking for any_errors_fatal 15621 1726882574.33578: checking for max_fail_percentage 15621 1726882574.33579: done checking for max_fail_percentage 15621 1726882574.33580: checking to see if all hosts have failed and the running result is not ok 15621 1726882574.33581: done checking to see if all hosts have failed 15621 1726882574.33582: getting the remaining hosts for this loop 15621 1726882574.33583: done getting the remaining hosts for this loop 15621 1726882574.33587: getting the next task for host managed_node3 15621 1726882574.33593: done getting next task for host managed_node3 15621 1726882574.33595: ^ task is: TASK: Install yum-utils package 15621 1726882574.33599: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882574.33603: getting variables 15621 1726882574.33605: in VariableManager get_vars() 15621 1726882574.33638: Calling all_inventory to load vars for managed_node3 15621 1726882574.33641: Calling groups_inventory to load vars for managed_node3 15621 1726882574.33645: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882574.33661: Calling all_plugins_play to load vars for managed_node3 15621 1726882574.33664: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882574.33667: Calling groups_plugins_play to load vars for managed_node3 15621 1726882574.34005: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882574.34217: done with get_vars() 15621 1726882574.34228: done getting variables 15621 1726882574.34314: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Friday 20 September 2024 21:36:14 -0400 (0:00:00.022) 0:00:06.422 ****** 15621 1726882574.34345: entering _queue_task() for managed_node3/package 15621 1726882574.34347: Creating lock for package 15621 1726882574.34587: worker is 1 (out of 1 available) 15621 1726882574.34600: exiting _queue_task() for managed_node3/package 15621 1726882574.34613: done queuing things up, now waiting for results queue to drain 15621 1726882574.34614: waiting for pending results... 
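The "Install yum-utils package" task (enable_epel.yml:26) uses the package action behind the same RedHat/CentOS guard and is skipped on this host in the trace that follows. A minimal sketch consistent with the task name and the recorded conditional:

- name: Install yum-utils package
  package:
    name: yum-utils
    state: present
  when: ansible_distribution in ['RedHat', 'CentOS']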
15621 1726882574.35043: running TaskExecutor() for managed_node3/TASK: Install yum-utils package 15621 1726882574.35048: in run() - task 0affc7ec-ae25-af1a-5b92-0000000000ae 15621 1726882574.35051: variable 'ansible_search_path' from source: unknown 15621 1726882574.35053: variable 'ansible_search_path' from source: unknown 15621 1726882574.35056: calling self._execute() 15621 1726882574.35109: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882574.35121: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882574.35141: variable 'omit' from source: magic vars 15621 1726882574.35521: variable 'ansible_distribution' from source: facts 15621 1726882574.35541: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 15621 1726882574.35549: when evaluation is False, skipping this task 15621 1726882574.35573: _execute() done 15621 1726882574.35576: dumping result to json 15621 1726882574.35578: done dumping result, returning 15621 1726882574.35583: done running TaskExecutor() for managed_node3/TASK: Install yum-utils package [0affc7ec-ae25-af1a-5b92-0000000000ae] 15621 1726882574.35593: sending task result for task 0affc7ec-ae25-af1a-5b92-0000000000ae 15621 1726882574.35747: done sending task result for task 0affc7ec-ae25-af1a-5b92-0000000000ae 15621 1726882574.35751: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 15621 1726882574.35835: no more pending results, returning what we have 15621 1726882574.35839: results queue empty 15621 1726882574.35840: checking for any_errors_fatal 15621 1726882574.35847: done checking for any_errors_fatal 15621 1726882574.35848: checking for max_fail_percentage 15621 1726882574.35849: done checking for max_fail_percentage 15621 1726882574.35850: checking to see if all hosts have failed and the running result is not ok 15621 1726882574.35851: done checking to see if all hosts have failed 15621 1726882574.35852: getting the remaining hosts for this loop 15621 1726882574.35853: done getting the remaining hosts for this loop 15621 1726882574.35858: getting the next task for host managed_node3 15621 1726882574.35864: done getting next task for host managed_node3 15621 1726882574.35866: ^ task is: TASK: Enable EPEL 7 15621 1726882574.35870: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882574.35874: getting variables 15621 1726882574.35876: in VariableManager get_vars() 15621 1726882574.35906: Calling all_inventory to load vars for managed_node3 15621 1726882574.35909: Calling groups_inventory to load vars for managed_node3 15621 1726882574.35912: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882574.35927: Calling all_plugins_play to load vars for managed_node3 15621 1726882574.35932: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882574.35935: Calling groups_plugins_play to load vars for managed_node3 15621 1726882574.36215: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882574.36417: done with get_vars() 15621 1726882574.36428: done getting variables 15621 1726882574.36484: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Friday 20 September 2024 21:36:14 -0400 (0:00:00.021) 0:00:06.444 ****** 15621 1726882574.36512: entering _queue_task() for managed_node3/command 15621 1726882574.36734: worker is 1 (out of 1 available) 15621 1726882574.36746: exiting _queue_task() for managed_node3/command 15621 1726882574.36759: done queuing things up, now waiting for results queue to drain 15621 1726882574.36760: waiting for pending results... 15621 1726882574.37142: running TaskExecutor() for managed_node3/TASK: Enable EPEL 7 15621 1726882574.37146: in run() - task 0affc7ec-ae25-af1a-5b92-0000000000af 15621 1726882574.37149: variable 'ansible_search_path' from source: unknown 15621 1726882574.37151: variable 'ansible_search_path' from source: unknown 15621 1726882574.37166: calling self._execute() 15621 1726882574.37243: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882574.37255: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882574.37267: variable 'omit' from source: magic vars 15621 1726882574.37640: variable 'ansible_distribution' from source: facts 15621 1726882574.37657: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 15621 1726882574.37664: when evaluation is False, skipping this task 15621 1726882574.37676: _execute() done 15621 1726882574.37684: dumping result to json 15621 1726882574.37692: done dumping result, returning 15621 1726882574.37701: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 7 [0affc7ec-ae25-af1a-5b92-0000000000af] 15621 1726882574.37712: sending task result for task 0affc7ec-ae25-af1a-5b92-0000000000af skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 15621 1726882574.38016: no more pending results, returning what we have 15621 1726882574.38018: results queue empty 15621 1726882574.38019: checking for any_errors_fatal 15621 1726882574.38026: done checking for any_errors_fatal 15621 1726882574.38027: checking for max_fail_percentage 15621 1726882574.38029: done checking for max_fail_percentage 15621 1726882574.38030: checking to see if all hosts have 
failed and the running result is not ok 15621 1726882574.38031: done checking to see if all hosts have failed 15621 1726882574.38031: getting the remaining hosts for this loop 15621 1726882574.38032: done getting the remaining hosts for this loop 15621 1726882574.38036: getting the next task for host managed_node3 15621 1726882574.38041: done getting next task for host managed_node3 15621 1726882574.38043: ^ task is: TASK: Enable EPEL 8 15621 1726882574.38047: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15621 1726882574.38050: getting variables 15621 1726882574.38052: in VariableManager get_vars() 15621 1726882574.38077: Calling all_inventory to load vars for managed_node3 15621 1726882574.38079: Calling groups_inventory to load vars for managed_node3 15621 1726882574.38082: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882574.38091: Calling all_plugins_play to load vars for managed_node3 15621 1726882574.38094: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882574.38097: Calling groups_plugins_play to load vars for managed_node3 15621 1726882574.38278: done sending task result for task 0affc7ec-ae25-af1a-5b92-0000000000af 15621 1726882574.38281: WORKER PROCESS EXITING 15621 1726882574.38305: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882574.38512: done with get_vars() 15621 1726882574.38521: done getting variables 15621 1726882574.38581: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Friday 20 September 2024 21:36:14 -0400 (0:00:00.020) 0:00:06.465 ****** 15621 1726882574.38611: entering _queue_task() for managed_node3/command 15621 1726882574.38828: worker is 1 (out of 1 available) 15621 1726882574.38839: exiting _queue_task() for managed_node3/command 15621 1726882574.38850: done queuing things up, now waiting for results queue to drain 15621 1726882574.38851: waiting for pending results... 
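"Enable EPEL 7" (enable_epel.yml:32) and "Enable EPEL 8" (enable_epel.yml:37) are both command tasks behind the same distribution guard, and both are skipped here. The exact command does not appear in the trace; enabling the epel repository with yum-config-manager (supplied by the yum-utils task above) is an assumed but typical implementation:

- name: Enable EPEL 7
  command: yum-config-manager --enable epel   # assumed command
  when: ansible_distribution in ['RedHat', 'CentOS']

The "Enable EPEL 8" variant presumably differs only in the tool it invokes (for example dnf config-manager); for both tasks the trace records only the module, the task path, and the failing guard.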
15621 1726882574.39085: running TaskExecutor() for managed_node3/TASK: Enable EPEL 8 15621 1726882574.39195: in run() - task 0affc7ec-ae25-af1a-5b92-0000000000b0 15621 1726882574.39216: variable 'ansible_search_path' from source: unknown 15621 1726882574.39226: variable 'ansible_search_path' from source: unknown 15621 1726882574.39266: calling self._execute() 15621 1726882574.39348: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882574.39364: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882574.39380: variable 'omit' from source: magic vars 15621 1726882574.39756: variable 'ansible_distribution' from source: facts 15621 1726882574.39773: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 15621 1726882574.39781: when evaluation is False, skipping this task 15621 1726882574.39788: _execute() done 15621 1726882574.39796: dumping result to json 15621 1726882574.39803: done dumping result, returning 15621 1726882574.39813: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 8 [0affc7ec-ae25-af1a-5b92-0000000000b0] 15621 1726882574.39824: sending task result for task 0affc7ec-ae25-af1a-5b92-0000000000b0 skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 15621 1726882574.40130: no more pending results, returning what we have 15621 1726882574.40133: results queue empty 15621 1726882574.40134: checking for any_errors_fatal 15621 1726882574.40139: done checking for any_errors_fatal 15621 1726882574.40139: checking for max_fail_percentage 15621 1726882574.40141: done checking for max_fail_percentage 15621 1726882574.40142: checking to see if all hosts have failed and the running result is not ok 15621 1726882574.40143: done checking to see if all hosts have failed 15621 1726882574.40143: getting the remaining hosts for this loop 15621 1726882574.40144: done getting the remaining hosts for this loop 15621 1726882574.40148: getting the next task for host managed_node3 15621 1726882574.40155: done getting next task for host managed_node3 15621 1726882574.40158: ^ task is: TASK: Enable EPEL 6 15621 1726882574.40161: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882574.40164: getting variables 15621 1726882574.40165: in VariableManager get_vars() 15621 1726882574.40190: Calling all_inventory to load vars for managed_node3 15621 1726882574.40193: Calling groups_inventory to load vars for managed_node3 15621 1726882574.40196: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882574.40205: Calling all_plugins_play to load vars for managed_node3 15621 1726882574.40208: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882574.40211: Calling groups_plugins_play to load vars for managed_node3 15621 1726882574.40360: done sending task result for task 0affc7ec-ae25-af1a-5b92-0000000000b0 15621 1726882574.40364: WORKER PROCESS EXITING 15621 1726882574.40387: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882574.40593: done with get_vars() 15621 1726882574.40602: done getting variables 15621 1726882574.40661: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Friday 20 September 2024 21:36:14 -0400 (0:00:00.020) 0:00:06.486 ****** 15621 1726882574.40690: entering _queue_task() for managed_node3/copy 15621 1726882574.40908: worker is 1 (out of 1 available) 15621 1726882574.40921: exiting _queue_task() for managed_node3/copy 15621 1726882574.41133: done queuing things up, now waiting for results queue to drain 15621 1726882574.41135: waiting for pending results... 
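"Enable EPEL 6" (enable_epel.yml:42) switches to the copy action, which suggests the EPEL 6 repo definition is written out directly rather than enabled via a tool; it is likewise skipped on this host in the next entries. A hedged sketch; the destination and contents are assumptions, as only the copy module and the conditional appear in the log:

- name: Enable EPEL 6
  copy:
    dest: /etc/yum.repos.d/epel.repo   # assumed destination
    content: |
      # assumed contents for an EPEL 6 repo definition
      [epel]
      name=Extra Packages for Enterprise Linux 6
      baseurl=https://archives.fedoraproject.org/pub/archive/epel/6/$basearch/
      enabled=1
      gpgcheck=0
  when: ansible_distribution in ['RedHat', 'CentOS']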
15621 1726882574.41168: running TaskExecutor() for managed_node3/TASK: Enable EPEL 6 15621 1726882574.41279: in run() - task 0affc7ec-ae25-af1a-5b92-0000000000b2 15621 1726882574.41296: variable 'ansible_search_path' from source: unknown 15621 1726882574.41304: variable 'ansible_search_path' from source: unknown 15621 1726882574.41348: calling self._execute() 15621 1726882574.41426: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882574.41438: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882574.41452: variable 'omit' from source: magic vars 15621 1726882574.41840: variable 'ansible_distribution' from source: facts 15621 1726882574.41857: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 15621 1726882574.41864: when evaluation is False, skipping this task 15621 1726882574.41872: _execute() done 15621 1726882574.41879: dumping result to json 15621 1726882574.41887: done dumping result, returning 15621 1726882574.41900: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 6 [0affc7ec-ae25-af1a-5b92-0000000000b2] 15621 1726882574.41910: sending task result for task 0affc7ec-ae25-af1a-5b92-0000000000b2 skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 15621 1726882574.42067: no more pending results, returning what we have 15621 1726882574.42071: results queue empty 15621 1726882574.42072: checking for any_errors_fatal 15621 1726882574.42077: done checking for any_errors_fatal 15621 1726882574.42078: checking for max_fail_percentage 15621 1726882574.42079: done checking for max_fail_percentage 15621 1726882574.42080: checking to see if all hosts have failed and the running result is not ok 15621 1726882574.42082: done checking to see if all hosts have failed 15621 1726882574.42082: getting the remaining hosts for this loop 15621 1726882574.42084: done getting the remaining hosts for this loop 15621 1726882574.42088: getting the next task for host managed_node3 15621 1726882574.42098: done getting next task for host managed_node3 15621 1726882574.42101: ^ task is: TASK: Set network provider to 'nm' 15621 1726882574.42104: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882574.42108: getting variables 15621 1726882574.42109: in VariableManager get_vars() 15621 1726882574.42142: Calling all_inventory to load vars for managed_node3 15621 1726882574.42145: Calling groups_inventory to load vars for managed_node3 15621 1726882574.42149: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882574.42164: Calling all_plugins_play to load vars for managed_node3 15621 1726882574.42167: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882574.42171: Calling groups_plugins_play to load vars for managed_node3 15621 1726882574.42564: done sending task result for task 0affc7ec-ae25-af1a-5b92-0000000000b2 15621 1726882574.42568: WORKER PROCESS EXITING 15621 1726882574.42592: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882574.42786: done with get_vars() 15621 1726882574.42795: done getting variables 15621 1726882574.42855: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml:13 Friday 20 September 2024 21:36:14 -0400 (0:00:00.021) 0:00:06.508 ****** 15621 1726882574.42882: entering _queue_task() for managed_node3/set_fact 15621 1726882574.43114: worker is 1 (out of 1 available) 15621 1726882574.43128: exiting _queue_task() for managed_node3/set_fact 15621 1726882574.43140: done queuing things up, now waiting for results queue to drain 15621 1726882574.43141: waiting for pending results... 
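The play-level task at tests_ethernet_nm.yml:13 sets the network_provider fact, and its result is visible in the ok: output further down, where network_provider comes back as "nm". This one can be reconstructed almost verbatim from the trace:

- name: Set network provider to 'nm'
  set_fact:
    network_provider: nm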
15621 1726882574.43383: running TaskExecutor() for managed_node3/TASK: Set network provider to 'nm' 15621 1726882574.43474: in run() - task 0affc7ec-ae25-af1a-5b92-000000000007 15621 1726882574.43497: variable 'ansible_search_path' from source: unknown 15621 1726882574.43544: calling self._execute() 15621 1726882574.43624: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882574.43638: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882574.43652: variable 'omit' from source: magic vars 15621 1726882574.43763: variable 'omit' from source: magic vars 15621 1726882574.43801: variable 'omit' from source: magic vars 15621 1726882574.43848: variable 'omit' from source: magic vars 15621 1726882574.43893: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882574.43941: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882574.43967: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882574.43992: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882574.44008: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882574.44047: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882574.44057: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882574.44064: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882574.44181: Set connection var ansible_connection to ssh 15621 1726882574.44197: Set connection var ansible_shell_executable to /bin/sh 15621 1726882574.44208: Set connection var ansible_timeout to 10 15621 1726882574.44215: Set connection var ansible_shell_type to sh 15621 1726882574.44226: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882574.44235: Set connection var ansible_pipelining to False 15621 1726882574.44327: variable 'ansible_shell_executable' from source: unknown 15621 1726882574.44331: variable 'ansible_connection' from source: unknown 15621 1726882574.44333: variable 'ansible_module_compression' from source: unknown 15621 1726882574.44335: variable 'ansible_shell_type' from source: unknown 15621 1726882574.44337: variable 'ansible_shell_executable' from source: unknown 15621 1726882574.44340: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882574.44342: variable 'ansible_pipelining' from source: unknown 15621 1726882574.44344: variable 'ansible_timeout' from source: unknown 15621 1726882574.44346: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882574.44462: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15621 1726882574.44483: variable 'omit' from source: magic vars 15621 1726882574.44493: starting attempt loop 15621 1726882574.44499: running the handler 15621 1726882574.44515: handler run complete 15621 1726882574.44531: attempt loop complete, returning result 15621 1726882574.44538: _execute() done 15621 1726882574.44544: 
dumping result to json 15621 1726882574.44591: done dumping result, returning 15621 1726882574.44594: done running TaskExecutor() for managed_node3/TASK: Set network provider to 'nm' [0affc7ec-ae25-af1a-5b92-000000000007] 15621 1726882574.44596: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000007 ok: [managed_node3] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 15621 1726882574.44751: no more pending results, returning what we have 15621 1726882574.44754: results queue empty 15621 1726882574.44754: checking for any_errors_fatal 15621 1726882574.44763: done checking for any_errors_fatal 15621 1726882574.44764: checking for max_fail_percentage 15621 1726882574.44765: done checking for max_fail_percentage 15621 1726882574.44766: checking to see if all hosts have failed and the running result is not ok 15621 1726882574.44768: done checking to see if all hosts have failed 15621 1726882574.44769: getting the remaining hosts for this loop 15621 1726882574.44770: done getting the remaining hosts for this loop 15621 1726882574.44774: getting the next task for host managed_node3 15621 1726882574.44781: done getting next task for host managed_node3 15621 1726882574.44783: ^ task is: TASK: meta (flush_handlers) 15621 1726882574.44784: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15621 1726882574.44789: getting variables 15621 1726882574.44791: in VariableManager get_vars() 15621 1726882574.44820: Calling all_inventory to load vars for managed_node3 15621 1726882574.44826: Calling groups_inventory to load vars for managed_node3 15621 1726882574.44830: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882574.44842: Calling all_plugins_play to load vars for managed_node3 15621 1726882574.44846: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882574.44849: Calling groups_plugins_play to load vars for managed_node3 15621 1726882574.45159: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000007 15621 1726882574.45162: WORKER PROCESS EXITING 15621 1726882574.45186: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882574.45386: done with get_vars() 15621 1726882574.45395: done getting variables 15621 1726882574.45462: in VariableManager get_vars() 15621 1726882574.45470: Calling all_inventory to load vars for managed_node3 15621 1726882574.45472: Calling groups_inventory to load vars for managed_node3 15621 1726882574.45475: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882574.45479: Calling all_plugins_play to load vars for managed_node3 15621 1726882574.45481: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882574.45484: Calling groups_plugins_play to load vars for managed_node3 15621 1726882574.45807: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882574.46001: done with get_vars() 15621 1726882574.46013: done queuing things up, now waiting for results queue to drain 15621 1726882574.46015: results queue empty 15621 1726882574.46016: checking for any_errors_fatal 15621 1726882574.46018: done checking for any_errors_fatal 15621 1726882574.46019: checking for 
max_fail_percentage 15621 1726882574.46020: done checking for max_fail_percentage 15621 1726882574.46020: checking to see if all hosts have failed and the running result is not ok 15621 1726882574.46021: done checking to see if all hosts have failed 15621 1726882574.46024: getting the remaining hosts for this loop 15621 1726882574.46025: done getting the remaining hosts for this loop 15621 1726882574.46027: getting the next task for host managed_node3 15621 1726882574.46031: done getting next task for host managed_node3 15621 1726882574.46032: ^ task is: TASK: meta (flush_handlers) 15621 1726882574.46034: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15621 1726882574.46040: getting variables 15621 1726882574.46041: in VariableManager get_vars() 15621 1726882574.46049: Calling all_inventory to load vars for managed_node3 15621 1726882574.46051: Calling groups_inventory to load vars for managed_node3 15621 1726882574.46053: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882574.46058: Calling all_plugins_play to load vars for managed_node3 15621 1726882574.46060: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882574.46063: Calling groups_plugins_play to load vars for managed_node3 15621 1726882574.46199: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882574.46393: done with get_vars() 15621 1726882574.46400: done getting variables 15621 1726882574.46448: in VariableManager get_vars() 15621 1726882574.46456: Calling all_inventory to load vars for managed_node3 15621 1726882574.46459: Calling groups_inventory to load vars for managed_node3 15621 1726882574.46461: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882574.46465: Calling all_plugins_play to load vars for managed_node3 15621 1726882574.46468: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882574.46471: Calling groups_plugins_play to load vars for managed_node3 15621 1726882574.46598: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882574.46793: done with get_vars() 15621 1726882574.46803: done queuing things up, now waiting for results queue to drain 15621 1726882574.46804: results queue empty 15621 1726882574.46805: checking for any_errors_fatal 15621 1726882574.46806: done checking for any_errors_fatal 15621 1726882574.46806: checking for max_fail_percentage 15621 1726882574.46807: done checking for max_fail_percentage 15621 1726882574.46808: checking to see if all hosts have failed and the running result is not ok 15621 1726882574.46808: done checking to see if all hosts have failed 15621 1726882574.46809: getting the remaining hosts for this loop 15621 1726882574.46810: done getting the remaining hosts for this loop 15621 1726882574.46812: getting the next task for host managed_node3 15621 1726882574.46814: done getting next task for host managed_node3 15621 1726882574.46815: ^ task is: None 15621 1726882574.46816: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 15621 1726882574.46817: done queuing things up, now waiting for results queue to drain 15621 1726882574.46817: results queue empty 15621 1726882574.46818: checking for any_errors_fatal 15621 1726882574.46819: done checking for any_errors_fatal 15621 1726882574.46819: checking for max_fail_percentage 15621 1726882574.46820: done checking for max_fail_percentage 15621 1726882574.46821: checking to see if all hosts have failed and the running result is not ok 15621 1726882574.46824: done checking to see if all hosts have failed 15621 1726882574.46825: getting the next task for host managed_node3 15621 1726882574.46827: done getting next task for host managed_node3 15621 1726882574.46828: ^ task is: None 15621 1726882574.46829: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15621 1726882574.46868: in VariableManager get_vars() 15621 1726882574.46881: done with get_vars() 15621 1726882574.46885: in VariableManager get_vars() 15621 1726882574.46892: done with get_vars() 15621 1726882574.46896: variable 'omit' from source: magic vars 15621 1726882574.46924: in VariableManager get_vars() 15621 1726882574.46932: done with get_vars() 15621 1726882574.46949: variable 'omit' from source: magic vars PLAY [Play for showing the network provider] *********************************** 15621 1726882574.47131: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15621 1726882574.47157: getting the remaining hosts for this loop 15621 1726882574.47158: done getting the remaining hosts for this loop 15621 1726882574.47161: getting the next task for host managed_node3 15621 1726882574.47163: done getting next task for host managed_node3 15621 1726882574.47165: ^ task is: TASK: Gathering Facts 15621 1726882574.47167: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882574.47169: getting variables 15621 1726882574.47170: in VariableManager get_vars() 15621 1726882574.47178: Calling all_inventory to load vars for managed_node3 15621 1726882574.47180: Calling groups_inventory to load vars for managed_node3 15621 1726882574.47182: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882574.47187: Calling all_plugins_play to load vars for managed_node3 15621 1726882574.47201: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882574.47204: Calling groups_plugins_play to load vars for managed_node3 15621 1726882574.47344: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882574.47526: done with get_vars() 15621 1726882574.47534: done getting variables 15621 1726882574.47573: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:3 Friday 20 September 2024 21:36:14 -0400 (0:00:00.047) 0:00:06.555 ****** 15621 1726882574.47595: entering _queue_task() for managed_node3/gather_facts 15621 1726882574.47824: worker is 1 (out of 1 available) 15621 1726882574.47837: exiting _queue_task() for managed_node3/gather_facts 15621 1726882574.47849: done queuing things up, now waiting for results queue to drain 15621 1726882574.47851: waiting for pending results... 
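The _low_level_execute_command() entries that follow trace the full remote round-trip performed for this Gathering Facts task over the already-established SSH multiplexing socket (/root/.ansible/cp/9aa64530f0): resolve the remote home directory with `echo ~`, create a per-task temp directory under ~/.ansible/tmp, push the AnsiballZ-packed setup module over SFTP, mark it executable, run it with /usr/bin/python3.12, and finally remove the temp directory. The sketch below only reproduces that visible command sequence for illustration; it is not Ansible's implementation, the host string and module filename are placeholders taken from this log, and scp stands in for the internal sftp 'put'.

import shlex
import subprocess
import time

# Rough sketch only -- not Ansible's implementation. The managed node address,
# key-based auth, and file names are placeholders modeled on this log; the real
# run multiplexes every call over the ControlMaster socket shown above.
HOST = "root@10.31.45.226"

def remote(cmd: str) -> str:
    """Run a command on the managed node the way the log shows: /bin/sh -c '<cmd>'."""
    out = subprocess.run(["ssh", HOST, "/bin/sh -c " + shlex.quote(cmd)],
                         capture_output=True, text=True, check=True)
    return out.stdout

# 1. Resolve the remote home directory ('echo ~ && sleep 0' in the log).
home = remote("echo ~ && sleep 0").strip()

# 2. Create the per-task temp directory under ~/.ansible/tmp with umask 77.
tmpdir = f"{home}/.ansible/tmp/ansible-tmp-{time.time()}"
remote(f'umask 77 && mkdir -p "{home}/.ansible/tmp" && mkdir "{tmpdir}" && echo {tmpdir}')

# 3. Transfer the packed module; the log does this with an internal sftp 'put',
#    scp is used here purely for illustration.
subprocess.run(["scp", "AnsiballZ_setup.py", f"{HOST}:{tmpdir}/AnsiballZ_setup.py"], check=True)

# 4. Mark it executable, run it with the remote interpreter, then clean up,
#    mirroring the chmod / python3.12 / 'rm -f -r' commands in the log.
remote(f"chmod u+x {tmpdir}/ {tmpdir}/AnsiballZ_setup.py")
facts_json = remote(f"/usr/bin/python3.12 {tmpdir}/AnsiballZ_setup.py")
remote(f"rm -f -r {tmpdir}/ > /dev/null 2>&1")

The stdout of step 4 is the single JSON document that appears in the log below as the setup module's result.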
15621 1726882574.48239: running TaskExecutor() for managed_node3/TASK: Gathering Facts 15621 1726882574.48244: in run() - task 0affc7ec-ae25-af1a-5b92-0000000000d8 15621 1726882574.48247: variable 'ansible_search_path' from source: unknown 15621 1726882574.48267: calling self._execute() 15621 1726882574.48346: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882574.48359: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882574.48372: variable 'omit' from source: magic vars 15621 1726882574.48758: variable 'ansible_distribution_major_version' from source: facts 15621 1726882574.48779: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882574.48788: variable 'omit' from source: magic vars 15621 1726882574.48813: variable 'omit' from source: magic vars 15621 1726882574.48851: variable 'omit' from source: magic vars 15621 1726882574.48898: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882574.48939: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882574.48963: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882574.48986: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882574.49004: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882574.49040: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882574.49101: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882574.49105: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882574.49165: Set connection var ansible_connection to ssh 15621 1726882574.49181: Set connection var ansible_shell_executable to /bin/sh 15621 1726882574.49191: Set connection var ansible_timeout to 10 15621 1726882574.49197: Set connection var ansible_shell_type to sh 15621 1726882574.49211: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882574.49220: Set connection var ansible_pipelining to False 15621 1726882574.49252: variable 'ansible_shell_executable' from source: unknown 15621 1726882574.49259: variable 'ansible_connection' from source: unknown 15621 1726882574.49266: variable 'ansible_module_compression' from source: unknown 15621 1726882574.49274: variable 'ansible_shell_type' from source: unknown 15621 1726882574.49281: variable 'ansible_shell_executable' from source: unknown 15621 1726882574.49319: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882574.49324: variable 'ansible_pipelining' from source: unknown 15621 1726882574.49327: variable 'ansible_timeout' from source: unknown 15621 1726882574.49329: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882574.49497: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15621 1726882574.49514: variable 'omit' from source: magic vars 15621 1726882574.49538: starting attempt loop 15621 1726882574.49542: running the 
handler 15621 1726882574.49555: variable 'ansible_facts' from source: unknown 15621 1726882574.49629: _low_level_execute_command(): starting 15621 1726882574.49632: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15621 1726882574.50429: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882574.50461: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882574.50481: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882574.50508: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882574.50642: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882574.52382: stdout chunk (state=3): >>>/root <<< 15621 1726882574.52580: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882574.52584: stdout chunk (state=3): >>><<< 15621 1726882574.52587: stderr chunk (state=3): >>><<< 15621 1726882574.52611: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882574.52704: _low_level_execute_command(): starting 15621 1726882574.52708: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882574.5261722-15889-125338850094131 `" && echo ansible-tmp-1726882574.5261722-15889-125338850094131="` echo 
/root/.ansible/tmp/ansible-tmp-1726882574.5261722-15889-125338850094131 `" ) && sleep 0' 15621 1726882574.53273: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882574.53289: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882574.53303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882574.53392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882574.53436: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882574.53452: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882574.53471: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882574.53585: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882574.55584: stdout chunk (state=3): >>>ansible-tmp-1726882574.5261722-15889-125338850094131=/root/.ansible/tmp/ansible-tmp-1726882574.5261722-15889-125338850094131 <<< 15621 1726882574.55765: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882574.55794: stdout chunk (state=3): >>><<< 15621 1726882574.55798: stderr chunk (state=3): >>><<< 15621 1726882574.56028: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882574.5261722-15889-125338850094131=/root/.ansible/tmp/ansible-tmp-1726882574.5261722-15889-125338850094131 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882574.56032: variable 'ansible_module_compression' from source: unknown 15621 1726882574.56035: ANSIBALLZ: using cached 
module: /root/.ansible/tmp/ansible-local-15621x2kdpbzd/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15621 1726882574.56038: variable 'ansible_facts' from source: unknown 15621 1726882574.56172: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882574.5261722-15889-125338850094131/AnsiballZ_setup.py 15621 1726882574.56559: Sending initial data 15621 1726882574.56570: Sent initial data (154 bytes) 15621 1726882574.57239: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882574.57277: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882574.57296: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882574.57317: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882574.57440: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882574.59077: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15621 1726882574.59161: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15621 1726882574.59245: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmp7sm0rc6m /root/.ansible/tmp/ansible-tmp-1726882574.5261722-15889-125338850094131/AnsiballZ_setup.py <<< 15621 1726882574.59254: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882574.5261722-15889-125338850094131/AnsiballZ_setup.py" <<< 15621 1726882574.59334: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmp7sm0rc6m" to remote "/root/.ansible/tmp/ansible-tmp-1726882574.5261722-15889-125338850094131/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882574.5261722-15889-125338850094131/AnsiballZ_setup.py" <<< 15621 1726882574.61038: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882574.61095: stderr chunk (state=3): >>><<< 15621 1726882574.61099: stdout chunk (state=3): >>><<< 15621 1726882574.61119: done transferring module to remote 15621 1726882574.61131: _low_level_execute_command(): starting 15621 1726882574.61136: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882574.5261722-15889-125338850094131/ /root/.ansible/tmp/ansible-tmp-1726882574.5261722-15889-125338850094131/AnsiballZ_setup.py && sleep 0' 15621 1726882574.61563: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882574.61567: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found <<< 15621 1726882574.61573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration <<< 15621 1726882574.61575: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found <<< 15621 1726882574.61578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882574.61623: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882574.61637: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882574.61715: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882574.63533: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882574.63587: stderr chunk (state=3): >>><<< 15621 1726882574.63592: stdout chunk (state=3): >>><<< 15621 1726882574.63628: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 
originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882574.63631: _low_level_execute_command(): starting 15621 1726882574.63634: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882574.5261722-15889-125338850094131/AnsiballZ_setup.py && sleep 0' 15621 1726882574.64043: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882574.64048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882574.64051: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882574.64053: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882574.64104: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882574.64107: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882574.64198: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882576.76935: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCqxUjCEqNblrTyW6Uf6mIYxca8N+p8oJNuOoXU65bGRNg3CMa5WjaOdqcLJoa5cqHU94Eb2GKTTyez0hcVUk7tsi3NxQudVrBDJQwbGPLKwHTfAOeffQrSKU6cQIc1wl+jLeNyQet7t+mRPHDLLjdsLuWud7KDSFY7tB05hqCIT7br7Ql/dFhcnCdWQFQMOHFOz3ScJe9gey/LD3ji7GRONjSr/t5cpKmB6mxzEmsb1n6YZdbP8HCphGcvKR4W+uaX3gVfQE0qvrqlobTyex8yIrkML2bRGO0cQ0YQWRYUwl+2NZufO8pixR1WlzvjooEQLCa77cJ6SZ8LyFkDOI+mMyuj8kcM9hS4AD91rPxl8C0d6Jg8RKqnImxC3X/NNaRYHqlewUGo6VKkcO4+lxgJGqYFcmkGEHzq4fuf6gtrr3rJkcIFcrluI0mSyZ2wXzI9K1OLHK0fnDvDUdV21RdTxfpz2ZFqykIWxdtugE4qaNMgbtV0VnufdkfZoCt9ayU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCVkskQ7Wf194qJaR5aLJzIbxDeKLsVL0wQFKV8r0F7GGZAGvI7/LHajoQ1NvRR35h4P+UpQQWPriVBtLfXYfXQ=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHICQbhAvKstSrwCX3R+nlPjOjLF0EHt/gL32n1ZS9Xl", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fips": false, "ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "ip-10-31-45-226.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-226", "ansible_nodename": "ip-10-31-45-226.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ea14692d25e88f0b7167787b368d", "ansible_lsb": {}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.180 36814 10.31.45.226 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 36814 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_is_chroot": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_n<<< 15621 1726882576.76942: stdout chunk (state=3): >>>umber": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", 
"second": "16", "epoch": "1726882576", "epoch_int": "1726882576", "date": "2024-09-20", "time": "21:36:16", "iso8601_micro": "2024-09-21T01:36:16.404378Z", "iso8601": "2024-09-21T01:36:16Z", "iso8601_basic": "20240920T213616404378", "iso8601_basic_short": "20240920T213616", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_local": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:b5954bb9-e972-4b2a-94f1-a82c77e96f77", "ansible_fibre_channel_wwn": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3096, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 620, "free": 3096}, "nocache": {"free": 3478, "used": 238}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ea14-692d-25e8-8f0b-7167787b368d", "ansible_product_uuid": "ec22ea14-692d-25e8-8f0b-7167787b368d", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"]}, "labels": {}, "masters": {}}, 
"ansible_uptime_seconds": 720, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251384430592, "block_size": 4096, "block_total": 64483404, "block_available": 61373152, "block_used": 3110252, "inode_total": 16384000, "inode_available": 16303144, "inode_used": 80856, "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"}], "ansible_pkg_mgr": "dnf", "ansible_iscsi_iqn": "", "ansible_loadavg": {"1m": 0.85498046875, "5m": 0.6611328125, "15m": 0.32421875}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:19:da:ea:a3:f3", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.226", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::19:daff:feea:a3f3", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.226", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:19:da:ea:a3:f3", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.226"], "ansible_all_ipv6_addresses": ["fe80::19:daff:feea:a3f3"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.226", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::19:daff:feea:a3f3"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15621 1726882576.78935: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 
<<< 15621 1726882576.79001: stderr chunk (state=3): >>><<< 15621 1726882576.79004: stdout chunk (state=3): >>><<< 15621 1726882576.79034: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCqxUjCEqNblrTyW6Uf6mIYxca8N+p8oJNuOoXU65bGRNg3CMa5WjaOdqcLJoa5cqHU94Eb2GKTTyez0hcVUk7tsi3NxQudVrBDJQwbGPLKwHTfAOeffQrSKU6cQIc1wl+jLeNyQet7t+mRPHDLLjdsLuWud7KDSFY7tB05hqCIT7br7Ql/dFhcnCdWQFQMOHFOz3ScJe9gey/LD3ji7GRONjSr/t5cpKmB6mxzEmsb1n6YZdbP8HCphGcvKR4W+uaX3gVfQE0qvrqlobTyex8yIrkML2bRGO0cQ0YQWRYUwl+2NZufO8pixR1WlzvjooEQLCa77cJ6SZ8LyFkDOI+mMyuj8kcM9hS4AD91rPxl8C0d6Jg8RKqnImxC3X/NNaRYHqlewUGo6VKkcO4+lxgJGqYFcmkGEHzq4fuf6gtrr3rJkcIFcrluI0mSyZ2wXzI9K1OLHK0fnDvDUdV21RdTxfpz2ZFqykIWxdtugE4qaNMgbtV0VnufdkfZoCt9ayU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCVkskQ7Wf194qJaR5aLJzIbxDeKLsVL0wQFKV8r0F7GGZAGvI7/LHajoQ1NvRR35h4P+UpQQWPriVBtLfXYfXQ=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHICQbhAvKstSrwCX3R+nlPjOjLF0EHt/gL32n1ZS9Xl", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fips": false, "ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "ip-10-31-45-226.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-226", "ansible_nodename": "ip-10-31-45-226.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ea14692d25e88f0b7167787b368d", "ansible_lsb": {}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.180 36814 10.31.45.226 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 36814 
22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_is_chroot": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", "second": "16", "epoch": "1726882576", "epoch_int": "1726882576", "date": "2024-09-20", "time": "21:36:16", "iso8601_micro": "2024-09-21T01:36:16.404378Z", "iso8601": "2024-09-21T01:36:16Z", "iso8601_basic": "20240920T213616404378", "iso8601_basic_short": "20240920T213616", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_local": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:b5954bb9-e972-4b2a-94f1-a82c77e96f77", "ansible_fibre_channel_wwn": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3096, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 620, "free": 3096}, "nocache": {"free": 3478, "used": 238}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ea14-692d-25e8-8f0b-7167787b368d", "ansible_product_uuid": "ec22ea14-692d-25e8-8f0b-7167787b368d", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "holders": []}, "xvda1": {"links": {"ids": 
[], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 720, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251384430592, "block_size": 4096, "block_total": 64483404, "block_available": 61373152, "block_used": 3110252, "inode_total": 16384000, "inode_available": 16303144, "inode_used": 80856, "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"}], "ansible_pkg_mgr": "dnf", "ansible_iscsi_iqn": "", "ansible_loadavg": {"1m": 0.85498046875, "5m": 0.6611328125, "15m": 0.32421875}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:19:da:ea:a3:f3", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.226", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::19:daff:feea:a3f3", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.226", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:19:da:ea:a3:f3", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.226"], "ansible_all_ipv6_addresses": ["fe80::19:daff:feea:a3f3"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.226", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::19:daff:feea:a3f3"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 15621 1726882576.79209: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882574.5261722-15889-125338850094131/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15621 1726882576.79232: _low_level_execute_command(): starting 15621 1726882576.79237: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882574.5261722-15889-125338850094131/ > /dev/null 2>&1 && sleep 0' 15621 1726882576.79719: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882576.79726: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found <<< 15621 1726882576.79729: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882576.79731: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882576.79733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882576.79794: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882576.79801: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882576.79803: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882576.79881: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882576.81788: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882576.81828: stderr chunk (state=3): >>><<< 15621 1726882576.81832: stdout chunk (state=3): >>><<< 15621 1726882576.81847: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882576.81854: handler run complete 15621 1726882576.81942: variable 'ansible_facts' from source: unknown 15621 1726882576.82012: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882576.82203: variable 'ansible_facts' from source: unknown 15621 1726882576.82260: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882576.82342: attempt loop complete, returning result 15621 1726882576.82346: _execute() done 15621 1726882576.82349: dumping result to json 15621 1726882576.82365: done dumping result, returning 15621 1726882576.82376: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [0affc7ec-ae25-af1a-5b92-0000000000d8] 15621 1726882576.82382: sending task result for task 0affc7ec-ae25-af1a-5b92-0000000000d8 15621 1726882576.82651: done sending task result for task 0affc7ec-ae25-af1a-5b92-0000000000d8 15621 1726882576.82654: WORKER PROCESS EXITING ok: [managed_node3] 15621 1726882576.82856: no more pending results, returning what we have 15621 1726882576.82859: results queue empty 15621 1726882576.82859: checking for any_errors_fatal 15621 1726882576.82860: done checking for any_errors_fatal 15621 1726882576.82861: checking for max_fail_percentage 15621 1726882576.82862: done checking for max_fail_percentage 15621 1726882576.82862: checking to see if all hosts have failed and the running result is not ok 15621 1726882576.82863: done checking to see if all hosts have failed 15621 1726882576.82863: getting the remaining hosts for this loop 15621 1726882576.82864: done getting the remaining hosts for this loop 15621 1726882576.82866: getting the next task for host managed_node3 15621 1726882576.82870: done getting next task for host managed_node3 15621 1726882576.82872: ^ task is: TASK: meta (flush_handlers) 15621 1726882576.82873: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882576.82875: getting variables 15621 1726882576.82876: in VariableManager get_vars() 15621 1726882576.82894: Calling all_inventory to load vars for managed_node3 15621 1726882576.82896: Calling groups_inventory to load vars for managed_node3 15621 1726882576.82898: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882576.82906: Calling all_plugins_play to load vars for managed_node3 15621 1726882576.82908: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882576.82910: Calling groups_plugins_play to load vars for managed_node3 15621 1726882576.83010: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882576.83135: done with get_vars() 15621 1726882576.83141: done getting variables 15621 1726882576.83194: in VariableManager get_vars() 15621 1726882576.83201: Calling all_inventory to load vars for managed_node3 15621 1726882576.83202: Calling groups_inventory to load vars for managed_node3 15621 1726882576.83204: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882576.83207: Calling all_plugins_play to load vars for managed_node3 15621 1726882576.83208: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882576.83210: Calling groups_plugins_play to load vars for managed_node3 15621 1726882576.83294: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882576.83406: done with get_vars() 15621 1726882576.83414: done queuing things up, now waiting for results queue to drain 15621 1726882576.83416: results queue empty 15621 1726882576.83416: checking for any_errors_fatal 15621 1726882576.83418: done checking for any_errors_fatal 15621 1726882576.83425: checking for max_fail_percentage 15621 1726882576.83426: done checking for max_fail_percentage 15621 1726882576.83427: checking to see if all hosts have failed and the running result is not ok 15621 1726882576.83427: done checking to see if all hosts have failed 15621 1726882576.83428: getting the remaining hosts for this loop 15621 1726882576.83429: done getting the remaining hosts for this loop 15621 1726882576.83430: getting the next task for host managed_node3 15621 1726882576.83433: done getting next task for host managed_node3 15621 1726882576.83435: ^ task is: TASK: Show inside ethernet tests 15621 1726882576.83435: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882576.83437: getting variables 15621 1726882576.83438: in VariableManager get_vars() 15621 1726882576.83443: Calling all_inventory to load vars for managed_node3 15621 1726882576.83445: Calling groups_inventory to load vars for managed_node3 15621 1726882576.83446: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882576.83449: Calling all_plugins_play to load vars for managed_node3 15621 1726882576.83451: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882576.83452: Calling groups_plugins_play to load vars for managed_node3 15621 1726882576.83540: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882576.83651: done with get_vars() 15621 1726882576.83656: done getting variables 15621 1726882576.83715: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Show inside ethernet tests] ********************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:6 Friday 20 September 2024 21:36:16 -0400 (0:00:02.361) 0:00:08.916 ****** 15621 1726882576.83735: entering _queue_task() for managed_node3/debug 15621 1726882576.83737: Creating lock for debug 15621 1726882576.83946: worker is 1 (out of 1 available) 15621 1726882576.83960: exiting _queue_task() for managed_node3/debug 15621 1726882576.83974: done queuing things up, now waiting for results queue to drain 15621 1726882576.83976: waiting for pending results... 
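The setup module's stdout captured above is one JSON document whose ansible_facts mapping drives everything that follows: the `ansible_distribution_major_version != '6'` conditional logged before each task is evaluated against it, and the ethernet tests rely on keys such as ansible_interfaces and ansible_default_ipv4. A minimal sketch of reading such a result back out of a saved copy (the file name is a placeholder, not an artifact of this run):

import json

# Illustrative only: load a saved copy of the setup module's JSON result, like the
# stdout shown above, and pull out the facts this play keeps using. "facts.json" is
# a hypothetical file name.
with open("facts.json") as fh:
    result = json.load(fh)

facts = result["ansible_facts"]

# The conditional this play evaluates before every task:
#   Evaluated conditional (ansible_distribution_major_version != '6'): True
runs_here = facts["ansible_distribution_major_version"] != "6"

print("distribution :", facts["ansible_distribution"], facts["ansible_distribution_version"])  # Fedora 40
print("interfaces   :", facts["ansible_interfaces"])                                           # ['eth0', 'lo']
print("default ipv4 :", facts["ansible_default_ipv4"]["address"])                              # 10.31.45.226
print("skip-on-EL6 conditional:", runs_here)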
15621 1726882576.84120: running TaskExecutor() for managed_node3/TASK: Show inside ethernet tests 15621 1726882576.84179: in run() - task 0affc7ec-ae25-af1a-5b92-00000000000b 15621 1726882576.84192: variable 'ansible_search_path' from source: unknown 15621 1726882576.84226: calling self._execute() 15621 1726882576.84317: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882576.84321: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882576.84323: variable 'omit' from source: magic vars 15621 1726882576.84650: variable 'ansible_distribution_major_version' from source: facts 15621 1726882576.84661: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882576.84667: variable 'omit' from source: magic vars 15621 1726882576.84689: variable 'omit' from source: magic vars 15621 1726882576.84714: variable 'omit' from source: magic vars 15621 1726882576.84752: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882576.84779: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882576.84795: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882576.84809: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882576.84819: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882576.84845: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882576.84848: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882576.84854: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882576.84926: Set connection var ansible_connection to ssh 15621 1726882576.84934: Set connection var ansible_shell_executable to /bin/sh 15621 1726882576.84940: Set connection var ansible_timeout to 10 15621 1726882576.84943: Set connection var ansible_shell_type to sh 15621 1726882576.84948: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882576.84953: Set connection var ansible_pipelining to False 15621 1726882576.84987: variable 'ansible_shell_executable' from source: unknown 15621 1726882576.84991: variable 'ansible_connection' from source: unknown 15621 1726882576.84993: variable 'ansible_module_compression' from source: unknown 15621 1726882576.84996: variable 'ansible_shell_type' from source: unknown 15621 1726882576.84999: variable 'ansible_shell_executable' from source: unknown 15621 1726882576.85001: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882576.85003: variable 'ansible_pipelining' from source: unknown 15621 1726882576.85005: variable 'ansible_timeout' from source: unknown 15621 1726882576.85007: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882576.85099: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15621 1726882576.85107: variable 'omit' from source: magic vars 15621 1726882576.85111: starting attempt loop 15621 1726882576.85114: running the 
handler 15621 1726882576.85151: handler run complete 15621 1726882576.85169: attempt loop complete, returning result 15621 1726882576.85175: _execute() done 15621 1726882576.85177: dumping result to json 15621 1726882576.85180: done dumping result, returning 15621 1726882576.85184: done running TaskExecutor() for managed_node3/TASK: Show inside ethernet tests [0affc7ec-ae25-af1a-5b92-00000000000b] 15621 1726882576.85192: sending task result for task 0affc7ec-ae25-af1a-5b92-00000000000b 15621 1726882576.85278: done sending task result for task 0affc7ec-ae25-af1a-5b92-00000000000b 15621 1726882576.85281: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Inside ethernet tests 15621 1726882576.85340: no more pending results, returning what we have 15621 1726882576.85342: results queue empty 15621 1726882576.85343: checking for any_errors_fatal 15621 1726882576.85345: done checking for any_errors_fatal 15621 1726882576.85345: checking for max_fail_percentage 15621 1726882576.85346: done checking for max_fail_percentage 15621 1726882576.85347: checking to see if all hosts have failed and the running result is not ok 15621 1726882576.85348: done checking to see if all hosts have failed 15621 1726882576.85349: getting the remaining hosts for this loop 15621 1726882576.85350: done getting the remaining hosts for this loop 15621 1726882576.85353: getting the next task for host managed_node3 15621 1726882576.85356: done getting next task for host managed_node3 15621 1726882576.85359: ^ task is: TASK: Show network_provider 15621 1726882576.85360: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882576.85364: getting variables 15621 1726882576.85365: in VariableManager get_vars() 15621 1726882576.85440: Calling all_inventory to load vars for managed_node3 15621 1726882576.85442: Calling groups_inventory to load vars for managed_node3 15621 1726882576.85445: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882576.85452: Calling all_plugins_play to load vars for managed_node3 15621 1726882576.85453: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882576.85455: Calling groups_plugins_play to load vars for managed_node3 15621 1726882576.85551: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882576.85672: done with get_vars() 15621 1726882576.85678: done getting variables 15621 1726882576.85719: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show network_provider] *************************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:9 Friday 20 September 2024 21:36:16 -0400 (0:00:00.020) 0:00:08.936 ****** 15621 1726882576.85739: entering _queue_task() for managed_node3/debug 15621 1726882576.85925: worker is 1 (out of 1 available) 15621 1726882576.85938: exiting _queue_task() for managed_node3/debug 15621 1726882576.85950: done queuing things up, now waiting for results queue to drain 15621 1726882576.85952: waiting for pending results... 15621 1726882576.86095: running TaskExecutor() for managed_node3/TASK: Show network_provider 15621 1726882576.86149: in run() - task 0affc7ec-ae25-af1a-5b92-00000000000c 15621 1726882576.86161: variable 'ansible_search_path' from source: unknown 15621 1726882576.86193: calling self._execute() 15621 1726882576.86253: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882576.86258: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882576.86267: variable 'omit' from source: magic vars 15621 1726882576.86528: variable 'ansible_distribution_major_version' from source: facts 15621 1726882576.86538: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882576.86543: variable 'omit' from source: magic vars 15621 1726882576.86564: variable 'omit' from source: magic vars 15621 1726882576.86590: variable 'omit' from source: magic vars 15621 1726882576.86622: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882576.86653: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882576.86667: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882576.86682: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882576.86693: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882576.86714: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882576.86719: variable 'ansible_host' from source: host vars for 
'managed_node3' 15621 1726882576.86722: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882576.86793: Set connection var ansible_connection to ssh 15621 1726882576.86801: Set connection var ansible_shell_executable to /bin/sh 15621 1726882576.86807: Set connection var ansible_timeout to 10 15621 1726882576.86810: Set connection var ansible_shell_type to sh 15621 1726882576.86815: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882576.86820: Set connection var ansible_pipelining to False 15621 1726882576.86844: variable 'ansible_shell_executable' from source: unknown 15621 1726882576.86848: variable 'ansible_connection' from source: unknown 15621 1726882576.86852: variable 'ansible_module_compression' from source: unknown 15621 1726882576.86855: variable 'ansible_shell_type' from source: unknown 15621 1726882576.86857: variable 'ansible_shell_executable' from source: unknown 15621 1726882576.86860: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882576.86862: variable 'ansible_pipelining' from source: unknown 15621 1726882576.86864: variable 'ansible_timeout' from source: unknown 15621 1726882576.86867: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882576.86969: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15621 1726882576.86980: variable 'omit' from source: magic vars 15621 1726882576.86983: starting attempt loop 15621 1726882576.86987: running the handler 15621 1726882576.87020: variable 'network_provider' from source: set_fact 15621 1726882576.87103: variable 'network_provider' from source: set_fact 15621 1726882576.87107: handler run complete 15621 1726882576.87110: attempt loop complete, returning result 15621 1726882576.87113: _execute() done 15621 1726882576.87115: dumping result to json 15621 1726882576.87118: done dumping result, returning 15621 1726882576.87121: done running TaskExecutor() for managed_node3/TASK: Show network_provider [0affc7ec-ae25-af1a-5b92-00000000000c] 15621 1726882576.87125: sending task result for task 0affc7ec-ae25-af1a-5b92-00000000000c 15621 1726882576.87209: done sending task result for task 0affc7ec-ae25-af1a-5b92-00000000000c 15621 1726882576.87212: WORKER PROCESS EXITING ok: [managed_node3] => { "network_provider": "nm" } 15621 1726882576.87259: no more pending results, returning what we have 15621 1726882576.87262: results queue empty 15621 1726882576.87263: checking for any_errors_fatal 15621 1726882576.87267: done checking for any_errors_fatal 15621 1726882576.87268: checking for max_fail_percentage 15621 1726882576.87269: done checking for max_fail_percentage 15621 1726882576.87272: checking to see if all hosts have failed and the running result is not ok 15621 1726882576.87273: done checking to see if all hosts have failed 15621 1726882576.87274: getting the remaining hosts for this loop 15621 1726882576.87275: done getting the remaining hosts for this loop 15621 1726882576.87278: getting the next task for host managed_node3 15621 1726882576.87283: done getting next task for host managed_node3 15621 1726882576.87284: ^ task is: TASK: meta (flush_handlers) 15621 1726882576.87286: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15621 1726882576.87289: getting variables 15621 1726882576.87290: in VariableManager get_vars() 15621 1726882576.87313: Calling all_inventory to load vars for managed_node3 15621 1726882576.87316: Calling groups_inventory to load vars for managed_node3 15621 1726882576.87318: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882576.87334: Calling all_plugins_play to load vars for managed_node3 15621 1726882576.87336: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882576.87339: Calling groups_plugins_play to load vars for managed_node3 15621 1726882576.87445: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882576.87577: done with get_vars() 15621 1726882576.87583: done getting variables 15621 1726882576.87628: in VariableManager get_vars() 15621 1726882576.87634: Calling all_inventory to load vars for managed_node3 15621 1726882576.87636: Calling groups_inventory to load vars for managed_node3 15621 1726882576.87637: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882576.87640: Calling all_plugins_play to load vars for managed_node3 15621 1726882576.87642: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882576.87643: Calling groups_plugins_play to load vars for managed_node3 15621 1726882576.87729: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882576.87839: done with get_vars() 15621 1726882576.87848: done queuing things up, now waiting for results queue to drain 15621 1726882576.87849: results queue empty 15621 1726882576.87849: checking for any_errors_fatal 15621 1726882576.87851: done checking for any_errors_fatal 15621 1726882576.87851: checking for max_fail_percentage 15621 1726882576.87852: done checking for max_fail_percentage 15621 1726882576.87852: checking to see if all hosts have failed and the running result is not ok 15621 1726882576.87853: done checking to see if all hosts have failed 15621 1726882576.87853: getting the remaining hosts for this loop 15621 1726882576.87854: done getting the remaining hosts for this loop 15621 1726882576.87856: getting the next task for host managed_node3 15621 1726882576.87861: done getting next task for host managed_node3 15621 1726882576.87862: ^ task is: TASK: meta (flush_handlers) 15621 1726882576.87863: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882576.87865: getting variables 15621 1726882576.87866: in VariableManager get_vars() 15621 1726882576.87874: Calling all_inventory to load vars for managed_node3 15621 1726882576.87876: Calling groups_inventory to load vars for managed_node3 15621 1726882576.87878: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882576.87881: Calling all_plugins_play to load vars for managed_node3 15621 1726882576.87883: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882576.87885: Calling groups_plugins_play to load vars for managed_node3 15621 1726882576.87964: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882576.88093: done with get_vars() 15621 1726882576.88098: done getting variables 15621 1726882576.88130: in VariableManager get_vars() 15621 1726882576.88135: Calling all_inventory to load vars for managed_node3 15621 1726882576.88137: Calling groups_inventory to load vars for managed_node3 15621 1726882576.88138: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882576.88141: Calling all_plugins_play to load vars for managed_node3 15621 1726882576.88142: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882576.88144: Calling groups_plugins_play to load vars for managed_node3 15621 1726882576.88227: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882576.88338: done with get_vars() 15621 1726882576.88345: done queuing things up, now waiting for results queue to drain 15621 1726882576.88347: results queue empty 15621 1726882576.88347: checking for any_errors_fatal 15621 1726882576.88348: done checking for any_errors_fatal 15621 1726882576.88348: checking for max_fail_percentage 15621 1726882576.88349: done checking for max_fail_percentage 15621 1726882576.88350: checking to see if all hosts have failed and the running result is not ok 15621 1726882576.88350: done checking to see if all hosts have failed 15621 1726882576.88350: getting the remaining hosts for this loop 15621 1726882576.88351: done getting the remaining hosts for this loop 15621 1726882576.88353: getting the next task for host managed_node3 15621 1726882576.88354: done getting next task for host managed_node3 15621 1726882576.88355: ^ task is: None 15621 1726882576.88356: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882576.88357: done queuing things up, now waiting for results queue to drain 15621 1726882576.88357: results queue empty 15621 1726882576.88358: checking for any_errors_fatal 15621 1726882576.88358: done checking for any_errors_fatal 15621 1726882576.88359: checking for max_fail_percentage 15621 1726882576.88359: done checking for max_fail_percentage 15621 1726882576.88360: checking to see if all hosts have failed and the running result is not ok 15621 1726882576.88360: done checking to see if all hosts have failed 15621 1726882576.88361: getting the next task for host managed_node3 15621 1726882576.88363: done getting next task for host managed_node3 15621 1726882576.88363: ^ task is: None 15621 1726882576.88364: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15621 1726882576.88394: in VariableManager get_vars() 15621 1726882576.88405: done with get_vars() 15621 1726882576.88408: in VariableManager get_vars() 15621 1726882576.88415: done with get_vars() 15621 1726882576.88419: variable 'omit' from source: magic vars 15621 1726882576.88442: in VariableManager get_vars() 15621 1726882576.88449: done with get_vars() 15621 1726882576.88461: variable 'omit' from source: magic vars PLAY [Test configuring ethernet devices] *************************************** 15621 1726882576.88580: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15621 1726882576.88598: getting the remaining hosts for this loop 15621 1726882576.88599: done getting the remaining hosts for this loop 15621 1726882576.88601: getting the next task for host managed_node3 15621 1726882576.88603: done getting next task for host managed_node3 15621 1726882576.88604: ^ task is: TASK: Gathering Facts 15621 1726882576.88605: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882576.88606: getting variables 15621 1726882576.88607: in VariableManager get_vars() 15621 1726882576.88612: Calling all_inventory to load vars for managed_node3 15621 1726882576.88613: Calling groups_inventory to load vars for managed_node3 15621 1726882576.88615: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882576.88618: Calling all_plugins_play to load vars for managed_node3 15621 1726882576.88619: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882576.88623: Calling groups_plugins_play to load vars for managed_node3 15621 1726882576.88732: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882576.88841: done with get_vars() 15621 1726882576.88847: done getting variables 15621 1726882576.88877: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:13 Friday 20 September 2024 21:36:16 -0400 (0:00:00.031) 0:00:08.968 ****** 15621 1726882576.88895: entering _queue_task() for managed_node3/gather_facts 15621 1726882576.89072: worker is 1 (out of 1 available) 15621 1726882576.89086: exiting _queue_task() for managed_node3/gather_facts 15621 1726882576.89097: done queuing things up, now waiting for results queue to drain 15621 1726882576.89099: waiting for pending results... 
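The "Gathering Facts" task queued here is the automatic fact-gathering step of the play "Test configuring ethernet devices" (task path tests_ethernet.yml:13). The play header presumably looks something like the sketch below; the hosts pattern is an assumption, since the play's inventory expression is not visible in this excerpt:

    - name: Test configuring ethernet devices
      hosts: all          # assumption: the real hosts pattern is not shown in this log excerpt
      gather_facts: true  # the "Gathering Facts" task is the implicit setup run for this play
      tasks:
        # ... the play's tasks are traced later in the run ...

In the trace that follows, the gather_facts action packages the setup module as an AnsiballZ payload and drives it over the persistent SSH connection with a short sequence of /bin/sh commands: discover the remote home directory (echo ~ && sleep 0), create a temporary directory under ~/.ansible/tmp, transfer AnsiballZ_setup.py via sftp, chmod u+x the directory and script, execute it with /usr/bin/python3.12 to produce the ansible_facts JSON, and finally rm -f -r the temporary directory.
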
15621 1726882576.89242: running TaskExecutor() for managed_node3/TASK: Gathering Facts 15621 1726882576.89299: in run() - task 0affc7ec-ae25-af1a-5b92-0000000000f0 15621 1726882576.89310: variable 'ansible_search_path' from source: unknown 15621 1726882576.89345: calling self._execute() 15621 1726882576.89396: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882576.89402: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882576.89410: variable 'omit' from source: magic vars 15621 1726882576.89685: variable 'ansible_distribution_major_version' from source: facts 15621 1726882576.89695: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882576.89700: variable 'omit' from source: magic vars 15621 1726882576.89718: variable 'omit' from source: magic vars 15621 1726882576.89746: variable 'omit' from source: magic vars 15621 1726882576.89777: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882576.89810: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882576.89825: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882576.89841: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882576.89851: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882576.89881: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882576.89884: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882576.89891: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882576.89961: Set connection var ansible_connection to ssh 15621 1726882576.89968: Set connection var ansible_shell_executable to /bin/sh 15621 1726882576.89978: Set connection var ansible_timeout to 10 15621 1726882576.89981: Set connection var ansible_shell_type to sh 15621 1726882576.89984: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882576.89991: Set connection var ansible_pipelining to False 15621 1726882576.90012: variable 'ansible_shell_executable' from source: unknown 15621 1726882576.90015: variable 'ansible_connection' from source: unknown 15621 1726882576.90017: variable 'ansible_module_compression' from source: unknown 15621 1726882576.90020: variable 'ansible_shell_type' from source: unknown 15621 1726882576.90025: variable 'ansible_shell_executable' from source: unknown 15621 1726882576.90027: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882576.90032: variable 'ansible_pipelining' from source: unknown 15621 1726882576.90034: variable 'ansible_timeout' from source: unknown 15621 1726882576.90039: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882576.90179: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15621 1726882576.90190: variable 'omit' from source: magic vars 15621 1726882576.90195: starting attempt loop 15621 1726882576.90197: running the 
handler 15621 1726882576.90214: variable 'ansible_facts' from source: unknown 15621 1726882576.90231: _low_level_execute_command(): starting 15621 1726882576.90238: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15621 1726882576.90774: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882576.90778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882576.90782: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882576.90786: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882576.90839: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882576.90842: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882576.90848: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882576.90938: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882576.92675: stdout chunk (state=3): >>>/root <<< 15621 1726882576.92785: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882576.92834: stderr chunk (state=3): >>><<< 15621 1726882576.92838: stdout chunk (state=3): >>><<< 15621 1726882576.92858: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882576.92869: _low_level_execute_command(): starting 15621 1726882576.92875: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726882576.9285634-15938-101754841462805 `" && echo ansible-tmp-1726882576.9285634-15938-101754841462805="` echo /root/.ansible/tmp/ansible-tmp-1726882576.9285634-15938-101754841462805 `" ) && sleep 0' 15621 1726882576.93316: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882576.93319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882576.93330: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration <<< 15621 1726882576.93332: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882576.93335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882576.93376: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882576.93382: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882576.93474: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882576.95467: stdout chunk (state=3): >>>ansible-tmp-1726882576.9285634-15938-101754841462805=/root/.ansible/tmp/ansible-tmp-1726882576.9285634-15938-101754841462805 <<< 15621 1726882576.95585: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882576.95632: stderr chunk (state=3): >>><<< 15621 1726882576.95635: stdout chunk (state=3): >>><<< 15621 1726882576.95654: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882576.9285634-15938-101754841462805=/root/.ansible/tmp/ansible-tmp-1726882576.9285634-15938-101754841462805 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status 
from master 0 15621 1726882576.95680: variable 'ansible_module_compression' from source: unknown 15621 1726882576.95717: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15621x2kdpbzd/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15621 1726882576.95767: variable 'ansible_facts' from source: unknown 15621 1726882576.95897: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882576.9285634-15938-101754841462805/AnsiballZ_setup.py 15621 1726882576.96011: Sending initial data 15621 1726882576.96014: Sent initial data (154 bytes) 15621 1726882576.96608: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882576.96635: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882576.96662: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882576.96776: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882576.98361: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 15621 1726882576.98370: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15621 1726882576.98448: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15621 1726882576.98538: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmpnpatauxd /root/.ansible/tmp/ansible-tmp-1726882576.9285634-15938-101754841462805/AnsiballZ_setup.py <<< 15621 1726882576.98549: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882576.9285634-15938-101754841462805/AnsiballZ_setup.py" <<< 15621 1726882576.98641: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmpnpatauxd" to remote "/root/.ansible/tmp/ansible-tmp-1726882576.9285634-15938-101754841462805/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882576.9285634-15938-101754841462805/AnsiballZ_setup.py" <<< 15621 1726882577.00429: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882577.00433: stdout chunk (state=3): >>><<< 15621 1726882577.00436: stderr chunk (state=3): >>><<< 15621 1726882577.00438: done transferring module to remote 15621 1726882577.00441: _low_level_execute_command(): starting 15621 1726882577.00443: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882576.9285634-15938-101754841462805/ /root/.ansible/tmp/ansible-tmp-1726882576.9285634-15938-101754841462805/AnsiballZ_setup.py && sleep 0' 15621 1726882577.01090: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882577.01098: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882577.01109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882577.01126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882577.01139: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 15621 1726882577.01183: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882577.01254: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882577.01290: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882577.01405: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882577.03216: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882577.03268: stderr chunk (state=3): >>><<< 15621 1726882577.03271: stdout chunk (state=3): >>><<< 15621 1726882577.03287: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882577.03290: _low_level_execute_command(): starting 15621 1726882577.03295: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882576.9285634-15938-101754841462805/AnsiballZ_setup.py && sleep 0' 15621 1726882577.03706: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882577.03743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found <<< 15621 1726882577.03746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882577.03748: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882577.03751: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882577.03794: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882577.03798: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882577.03892: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882579.00672: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.180 36814 10.31.45.226 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 36814 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", 
"SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCqxUjCEqNblrTyW6Uf6mIYxca8N+p8oJNuOoXU65bGRNg3CMa5WjaOdqcLJoa5cqHU94Eb2GKTTyez0hcVUk7tsi3NxQudVrBDJQwbGPLKwHTfAOeffQrSKU6cQIc1wl+jLeNyQet7t+mRPHDLLjdsLuWud7KDSFY7tB05hqCIT7br7Ql/dFhcnCdWQFQMOHFOz3ScJe9gey/LD3ji7GRONjSr/t5cpKmB6mxzEmsb1n6YZdbP8HCphGcvKR4W+uaX3gVfQE0qvrqlobTyex8yIrkML2bRGO0cQ0YQWRYUwl+2NZufO8pixR1WlzvjooEQLCa77cJ6SZ8LyFkDOI+mMyuj8kcM9hS4AD91rPxl8C0d6Jg8RKqnImxC3X/NNaRYHqlewUGo6VKkcO4+lxgJGqYFcmkGEHzq4fuf6gtrr3rJkcIFcrluI0mSyZ2wXzI9K1OLHK0fnDvDUdV21RdTxfpz2ZFqykIWxdtugE4qaNMgbtV0VnufdkfZoCt9ayU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCVkskQ7Wf194qJaR5aLJzIbxDeKLsVL0wQFKV8r0F7GGZAGvI7/LHajoQ1NvRR35h4P+UpQQWPriVBtLfXYfXQ=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHICQbhAvKstSrwCX3R+nlPjOjLF0EHt/gL32n1ZS9Xl", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:b5954bb9-e972-4b2a-94f1-a82c77e96f77", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", "second": "17", "epoch": "1726882577", "epoch_int": "1726882577", "date": "2024-09-20", "time": "21:36:17", "iso8601_micro": "2024-09-21T01:36:17.330980Z", "iso8601": "2024-09-21T01:36:17Z", "iso8601_basic": "20240920T213617330980", "iso8601_basic_short": "20240920T213617", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "ip-10-31-45-226.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-226", "ansible_nodename": "ip-10-31-45-226.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ea14692d25e88f0b7167787b368d", "ansible_fips": false, "ansible_apparmor": {"status": "disabled"}, "ansible_loadavg": {"1m": 0.85498046875, 
"5m": 0.6611328125, "15m": 0.32421875}, "ansible_iscsi_iqn": "", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_local": {}, "ansible_lsb": {}, "ansible_is_chroot": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_pkg_mgr": "dnf", "ansible_fibre_channel_wwn": [], "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:19:da:ea:a3:f3", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.226", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::19:daff:feea:a3f3", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.226", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:19:da:ea:a3:f3", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.226"], "ansible_all_ipv6_addresses": ["fe80::19:daff:feea:a3f3"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.226", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::19:daff:feea:a3f3"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3105, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 611, "free": 3105}, "nocache": {"free": 3487, "used": 229}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ea14-692d-25e8-8f0b-7167787b368d", "ansible_product_uuid": "ec22ea14-692d-25e8-8f0b-7167787b368d", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 
1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsi<<< 15621 1726882579.00684: stdout chunk (state=3): >>>ze": 512, "size": "250.00 GB", "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 723, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251384422400, "block_size": 4096, "block_total": 64483404, "block_available": 61373150, "block_used": 3110254, "inode_total": 16384000, "inode_available": 16303144, "inode_used": 80856, "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15621 1726882579.02856: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 
<<< 15621 1726882579.02860: stdout chunk (state=3): >>><<< 15621 1726882579.02863: stderr chunk (state=3): >>><<< 15621 1726882579.02891: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.180 36814 10.31.45.226 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 36814 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCqxUjCEqNblrTyW6Uf6mIYxca8N+p8oJNuOoXU65bGRNg3CMa5WjaOdqcLJoa5cqHU94Eb2GKTTyez0hcVUk7tsi3NxQudVrBDJQwbGPLKwHTfAOeffQrSKU6cQIc1wl+jLeNyQet7t+mRPHDLLjdsLuWud7KDSFY7tB05hqCIT7br7Ql/dFhcnCdWQFQMOHFOz3ScJe9gey/LD3ji7GRONjSr/t5cpKmB6mxzEmsb1n6YZdbP8HCphGcvKR4W+uaX3gVfQE0qvrqlobTyex8yIrkML2bRGO0cQ0YQWRYUwl+2NZufO8pixR1WlzvjooEQLCa77cJ6SZ8LyFkDOI+mMyuj8kcM9hS4AD91rPxl8C0d6Jg8RKqnImxC3X/NNaRYHqlewUGo6VKkcO4+lxgJGqYFcmkGEHzq4fuf6gtrr3rJkcIFcrluI0mSyZ2wXzI9K1OLHK0fnDvDUdV21RdTxfpz2ZFqykIWxdtugE4qaNMgbtV0VnufdkfZoCt9ayU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCVkskQ7Wf194qJaR5aLJzIbxDeKLsVL0wQFKV8r0F7GGZAGvI7/LHajoQ1NvRR35h4P+UpQQWPriVBtLfXYfXQ=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHICQbhAvKstSrwCX3R+nlPjOjLF0EHt/gL32n1ZS9Xl", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:b5954bb9-e972-4b2a-94f1-a82c77e96f77", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", "second": "17", "epoch": "1726882577", "epoch_int": "1726882577", "date": "2024-09-20", "time": "21:36:17", "iso8601_micro": "2024-09-21T01:36:17.330980Z", "iso8601": "2024-09-21T01:36:17Z", "iso8601_basic": 
"20240920T213617330980", "iso8601_basic_short": "20240920T213617", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "ip-10-31-45-226.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-226", "ansible_nodename": "ip-10-31-45-226.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ea14692d25e88f0b7167787b368d", "ansible_fips": false, "ansible_apparmor": {"status": "disabled"}, "ansible_loadavg": {"1m": 0.85498046875, "5m": 0.6611328125, "15m": 0.32421875}, "ansible_iscsi_iqn": "", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_local": {}, "ansible_lsb": {}, "ansible_is_chroot": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_pkg_mgr": "dnf", "ansible_fibre_channel_wwn": [], "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:19:da:ea:a3:f3", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.226", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::19:daff:feea:a3f3", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.226", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:19:da:ea:a3:f3", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.226"], "ansible_all_ipv6_addresses": ["fe80::19:daff:feea:a3f3"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.226", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::19:daff:feea:a3f3"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3105, "ansible_swaptotal_mb": 3715, 
"ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 611, "free": 3105}, "nocache": {"free": 3487, "used": 229}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ea14-692d-25e8-8f0b-7167787b368d", "ansible_product_uuid": "ec22ea14-692d-25e8-8f0b-7167787b368d", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 723, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251384422400, "block_size": 4096, "block_total": 64483404, "block_available": 61373150, "block_used": 3110254, "inode_total": 16384000, "inode_available": 16303144, "inode_used": 80856, "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 15621 1726882579.03242: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882576.9285634-15938-101754841462805/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15621 1726882579.03264: _low_level_execute_command(): starting 15621 1726882579.03278: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882576.9285634-15938-101754841462805/ > /dev/null 2>&1 && sleep 0' 15621 1726882579.03713: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882579.03752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882579.03756: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found <<< 15621 1726882579.03758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15621 1726882579.03760: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882579.03763: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882579.03810: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882579.03813: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882579.03901: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882579.05915: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882579.05919: stdout chunk (state=3): >>><<< 15621 1726882579.05924: stderr chunk (state=3): >>><<< 15621 1726882579.05941: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882579.06127: handler run complete 15621 1726882579.06131: variable 'ansible_facts' from source: unknown 15621 1726882579.06189: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882579.06529: variable 'ansible_facts' from source: unknown 15621 1726882579.06609: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882579.06687: attempt loop complete, returning result 15621 1726882579.06697: _execute() done 15621 1726882579.06703: dumping result to json 15621 1726882579.06719: done dumping result, returning 15621 1726882579.06729: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [0affc7ec-ae25-af1a-5b92-0000000000f0] 15621 1726882579.06744: sending task result for task 0affc7ec-ae25-af1a-5b92-0000000000f0 15621 1726882579.06983: done sending task result for task 0affc7ec-ae25-af1a-5b92-0000000000f0 15621 1726882579.06987: WORKER PROCESS EXITING ok: [managed_node3] 15621 1726882579.07193: no more pending results, returning what we have 15621 1726882579.07195: results queue empty 15621 1726882579.07196: checking for any_errors_fatal 15621 1726882579.07197: done checking for any_errors_fatal 15621 1726882579.07197: checking for max_fail_percentage 15621 1726882579.07198: done checking for max_fail_percentage 15621 1726882579.07199: checking to see if all hosts have failed and the running result is not ok 15621 1726882579.07199: done checking to see if all hosts have failed 15621 1726882579.07200: getting the remaining hosts for this loop 15621 1726882579.07201: done getting the remaining hosts for this loop 15621 1726882579.07203: getting the next task for host managed_node3 15621 1726882579.07207: done getting next task for host managed_node3 15621 1726882579.07208: ^ task is: TASK: meta (flush_handlers) 15621 1726882579.07209: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882579.07212: getting variables 15621 1726882579.07213: in VariableManager get_vars() 15621 1726882579.07236: Calling all_inventory to load vars for managed_node3 15621 1726882579.07238: Calling groups_inventory to load vars for managed_node3 15621 1726882579.07241: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882579.07250: Calling all_plugins_play to load vars for managed_node3 15621 1726882579.07252: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882579.07254: Calling groups_plugins_play to load vars for managed_node3 15621 1726882579.07356: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882579.07473: done with get_vars() 15621 1726882579.07481: done getting variables 15621 1726882579.07535: in VariableManager get_vars() 15621 1726882579.07541: Calling all_inventory to load vars for managed_node3 15621 1726882579.07543: Calling groups_inventory to load vars for managed_node3 15621 1726882579.07544: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882579.07547: Calling all_plugins_play to load vars for managed_node3 15621 1726882579.07549: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882579.07551: Calling groups_plugins_play to load vars for managed_node3 15621 1726882579.07638: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882579.07761: done with get_vars() 15621 1726882579.07771: done queuing things up, now waiting for results queue to drain 15621 1726882579.07773: results queue empty 15621 1726882579.07774: checking for any_errors_fatal 15621 1726882579.07776: done checking for any_errors_fatal 15621 1726882579.07781: checking for max_fail_percentage 15621 1726882579.07781: done checking for max_fail_percentage 15621 1726882579.07782: checking to see if all hosts have failed and the running result is not ok 15621 1726882579.07782: done checking to see if all hosts have failed 15621 1726882579.07783: getting the remaining hosts for this loop 15621 1726882579.07784: done getting the remaining hosts for this loop 15621 1726882579.07785: getting the next task for host managed_node3 15621 1726882579.07789: done getting next task for host managed_node3 15621 1726882579.07791: ^ task is: TASK: Set type={{ type }} and interface={{ interface }} 15621 1726882579.07792: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882579.07794: getting variables 15621 1726882579.07794: in VariableManager get_vars() 15621 1726882579.07800: Calling all_inventory to load vars for managed_node3 15621 1726882579.07801: Calling groups_inventory to load vars for managed_node3 15621 1726882579.07802: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882579.07805: Calling all_plugins_play to load vars for managed_node3 15621 1726882579.07807: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882579.07809: Calling groups_plugins_play to load vars for managed_node3 15621 1726882579.07892: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882579.08005: done with get_vars() 15621 1726882579.08011: done getting variables 15621 1726882579.08043: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15621 1726882579.08153: variable 'type' from source: play vars 15621 1726882579.08157: variable 'interface' from source: play vars TASK [Set type=veth and interface=lsr27] *************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:20 Friday 20 September 2024 21:36:19 -0400 (0:00:02.192) 0:00:11.161 ****** 15621 1726882579.08188: entering _queue_task() for managed_node3/set_fact 15621 1726882579.08485: worker is 1 (out of 1 available) 15621 1726882579.08499: exiting _queue_task() for managed_node3/set_fact 15621 1726882579.08513: done queuing things up, now waiting for results queue to drain 15621 1726882579.08514: waiting for pending results... 
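The trace that follows executes the set_fact task defined at tests_ethernet.yml:20. The task name template ("Set type={{ type }} and interface={{ interface }}") together with the play vars resolved above ("type" and "interface" from source: play vars) suggests a task shaped roughly like the sketch below; the exact file contents are not reproduced in this log, so treat this as an assumption rather than the canonical source:

    - name: Set type={{ type }} and interface={{ interface }}
      set_fact:
        type: "{{ type }}"           # re-exports the play var as a host fact
        interface: "{{ interface }}"

With type=veth and interface=lsr27 in the play vars, a task of this shape produces exactly the ansible_facts result reported for the task further down.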
15621 1726882579.08839: running TaskExecutor() for managed_node3/TASK: Set type=veth and interface=lsr27 15621 1726882579.08854: in run() - task 0affc7ec-ae25-af1a-5b92-00000000000f 15621 1726882579.08860: variable 'ansible_search_path' from source: unknown 15621 1726882579.08886: calling self._execute() 15621 1726882579.08971: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882579.08982: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882579.08993: variable 'omit' from source: magic vars 15621 1726882579.09402: variable 'ansible_distribution_major_version' from source: facts 15621 1726882579.09413: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882579.09419: variable 'omit' from source: magic vars 15621 1726882579.09447: variable 'omit' from source: magic vars 15621 1726882579.09473: variable 'type' from source: play vars 15621 1726882579.09550: variable 'type' from source: play vars 15621 1726882579.09561: variable 'interface' from source: play vars 15621 1726882579.09667: variable 'interface' from source: play vars 15621 1726882579.09688: variable 'omit' from source: magic vars 15621 1726882579.09735: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882579.10029: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882579.10033: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882579.10035: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882579.10038: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882579.10041: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882579.10043: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882579.10045: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882579.10047: Set connection var ansible_connection to ssh 15621 1726882579.10049: Set connection var ansible_shell_executable to /bin/sh 15621 1726882579.10051: Set connection var ansible_timeout to 10 15621 1726882579.10053: Set connection var ansible_shell_type to sh 15621 1726882579.10054: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882579.10057: Set connection var ansible_pipelining to False 15621 1726882579.10079: variable 'ansible_shell_executable' from source: unknown 15621 1726882579.10088: variable 'ansible_connection' from source: unknown 15621 1726882579.10094: variable 'ansible_module_compression' from source: unknown 15621 1726882579.10101: variable 'ansible_shell_type' from source: unknown 15621 1726882579.10106: variable 'ansible_shell_executable' from source: unknown 15621 1726882579.10112: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882579.10119: variable 'ansible_pipelining' from source: unknown 15621 1726882579.10131: variable 'ansible_timeout' from source: unknown 15621 1726882579.10138: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882579.10302: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15621 1726882579.10318: variable 'omit' from source: magic vars 15621 1726882579.10332: starting attempt loop 15621 1726882579.10339: running the handler 15621 1726882579.10356: handler run complete 15621 1726882579.10371: attempt loop complete, returning result 15621 1726882579.10378: _execute() done 15621 1726882579.10393: dumping result to json 15621 1726882579.10402: done dumping result, returning 15621 1726882579.10413: done running TaskExecutor() for managed_node3/TASK: Set type=veth and interface=lsr27 [0affc7ec-ae25-af1a-5b92-00000000000f] 15621 1726882579.10427: sending task result for task 0affc7ec-ae25-af1a-5b92-00000000000f ok: [managed_node3] => { "ansible_facts": { "interface": "lsr27", "type": "veth" }, "changed": false } 15621 1726882579.10741: no more pending results, returning what we have 15621 1726882579.10745: results queue empty 15621 1726882579.10745: checking for any_errors_fatal 15621 1726882579.10747: done checking for any_errors_fatal 15621 1726882579.10748: checking for max_fail_percentage 15621 1726882579.10750: done checking for max_fail_percentage 15621 1726882579.10750: checking to see if all hosts have failed and the running result is not ok 15621 1726882579.10751: done checking to see if all hosts have failed 15621 1726882579.10752: getting the remaining hosts for this loop 15621 1726882579.10753: done getting the remaining hosts for this loop 15621 1726882579.10757: getting the next task for host managed_node3 15621 1726882579.10761: done getting next task for host managed_node3 15621 1726882579.10764: ^ task is: TASK: Include the task 'show_interfaces.yml' 15621 1726882579.10766: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882579.10769: getting variables 15621 1726882579.10772: in VariableManager get_vars() 15621 1726882579.10800: Calling all_inventory to load vars for managed_node3 15621 1726882579.10802: Calling groups_inventory to load vars for managed_node3 15621 1726882579.10805: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882579.10816: Calling all_plugins_play to load vars for managed_node3 15621 1726882579.10819: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882579.10823: Calling groups_plugins_play to load vars for managed_node3 15621 1726882579.10983: done sending task result for task 0affc7ec-ae25-af1a-5b92-00000000000f 15621 1726882579.10986: WORKER PROCESS EXITING 15621 1726882579.10998: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882579.11117: done with get_vars() 15621 1726882579.11125: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:24 Friday 20 September 2024 21:36:19 -0400 (0:00:00.030) 0:00:11.191 ****** 15621 1726882579.11197: entering _queue_task() for managed_node3/include_tasks 15621 1726882579.11419: worker is 1 (out of 1 available) 15621 1726882579.11433: exiting _queue_task() for managed_node3/include_tasks 15621 1726882579.11445: done queuing things up, now waiting for results queue to drain 15621 1726882579.11447: waiting for pending results... 15621 1726882579.11608: running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' 15621 1726882579.11679: in run() - task 0affc7ec-ae25-af1a-5b92-000000000010 15621 1726882579.11690: variable 'ansible_search_path' from source: unknown 15621 1726882579.11720: calling self._execute() 15621 1726882579.11794: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882579.11800: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882579.11809: variable 'omit' from source: magic vars 15621 1726882579.12102: variable 'ansible_distribution_major_version' from source: facts 15621 1726882579.12113: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882579.12120: _execute() done 15621 1726882579.12126: dumping result to json 15621 1726882579.12130: done dumping result, returning 15621 1726882579.12137: done running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' [0affc7ec-ae25-af1a-5b92-000000000010] 15621 1726882579.12143: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000010 15621 1726882579.12232: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000010 15621 1726882579.12235: WORKER PROCESS EXITING 15621 1726882579.12268: no more pending results, returning what we have 15621 1726882579.12273: in VariableManager get_vars() 15621 1726882579.12305: Calling all_inventory to load vars for managed_node3 15621 1726882579.12308: Calling groups_inventory to load vars for managed_node3 15621 1726882579.12310: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882579.12321: Calling all_plugins_play to load vars for managed_node3 15621 1726882579.12327: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882579.12330: Calling groups_plugins_play to load vars for managed_node3 15621 1726882579.12471: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882579.12591: done with get_vars() 15621 1726882579.12602: variable 'ansible_search_path' from source: unknown 15621 1726882579.12618: we have included files to process 15621 1726882579.12619: generating all_blocks data 15621 1726882579.12620: done generating all_blocks data 15621 1726882579.12621: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 15621 1726882579.12624: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 15621 1726882579.12626: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 15621 1726882579.12774: in VariableManager get_vars() 15621 1726882579.12790: done with get_vars() 15621 1726882579.12902: done processing included file 15621 1726882579.12905: iterating over new_blocks loaded from include file 15621 1726882579.12906: in VariableManager get_vars() 15621 1726882579.12918: done with get_vars() 15621 1726882579.12919: filtering new block on tags 15621 1726882579.12938: done filtering new block on tags 15621 1726882579.12941: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node3 15621 1726882579.12946: extending task lists for all hosts with included blocks 15621 1726882579.13034: done extending task lists 15621 1726882579.13036: done processing included files 15621 1726882579.13037: results queue empty 15621 1726882579.13038: checking for any_errors_fatal 15621 1726882579.13042: done checking for any_errors_fatal 15621 1726882579.13043: checking for max_fail_percentage 15621 1726882579.13044: done checking for max_fail_percentage 15621 1726882579.13045: checking to see if all hosts have failed and the running result is not ok 15621 1726882579.13046: done checking to see if all hosts have failed 15621 1726882579.13047: getting the remaining hosts for this loop 15621 1726882579.13048: done getting the remaining hosts for this loop 15621 1726882579.13051: getting the next task for host managed_node3 15621 1726882579.13055: done getting next task for host managed_node3 15621 1726882579.13058: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 15621 1726882579.13060: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882579.13062: getting variables 15621 1726882579.13063: in VariableManager get_vars() 15621 1726882579.13071: Calling all_inventory to load vars for managed_node3 15621 1726882579.13073: Calling groups_inventory to load vars for managed_node3 15621 1726882579.13076: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882579.13081: Calling all_plugins_play to load vars for managed_node3 15621 1726882579.13084: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882579.13087: Calling groups_plugins_play to load vars for managed_node3 15621 1726882579.13260: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882579.13454: done with get_vars() 15621 1726882579.13464: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:36:19 -0400 (0:00:00.023) 0:00:11.214 ****** 15621 1726882579.13543: entering _queue_task() for managed_node3/include_tasks 15621 1726882579.13861: worker is 1 (out of 1 available) 15621 1726882579.13876: exiting _queue_task() for managed_node3/include_tasks 15621 1726882579.13893: done queuing things up, now waiting for results queue to drain 15621 1726882579.13895: waiting for pending results... 15621 1726882579.14076: running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' 15621 1726882579.14148: in run() - task 0affc7ec-ae25-af1a-5b92-000000000104 15621 1726882579.14161: variable 'ansible_search_path' from source: unknown 15621 1726882579.14164: variable 'ansible_search_path' from source: unknown 15621 1726882579.14200: calling self._execute() 15621 1726882579.14266: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882579.14271: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882579.14284: variable 'omit' from source: magic vars 15621 1726882579.14578: variable 'ansible_distribution_major_version' from source: facts 15621 1726882579.14589: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882579.14595: _execute() done 15621 1726882579.14599: dumping result to json 15621 1726882579.14604: done dumping result, returning 15621 1726882579.14612: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' [0affc7ec-ae25-af1a-5b92-000000000104] 15621 1726882579.14617: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000104 15621 1726882579.14709: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000104 15621 1726882579.14712: WORKER PROCESS EXITING 15621 1726882579.14747: no more pending results, returning what we have 15621 1726882579.14752: in VariableManager get_vars() 15621 1726882579.14789: Calling all_inventory to load vars for managed_node3 15621 1726882579.14792: Calling groups_inventory to load vars for managed_node3 15621 1726882579.14796: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882579.14811: Calling all_plugins_play to load vars for managed_node3 15621 1726882579.14813: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882579.14816: Calling groups_plugins_play to load vars for managed_node3 15621 1726882579.14975: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' 
skipped due to reserved name 15621 1726882579.15094: done with get_vars() 15621 1726882579.15100: variable 'ansible_search_path' from source: unknown 15621 1726882579.15101: variable 'ansible_search_path' from source: unknown 15621 1726882579.15134: we have included files to process 15621 1726882579.15134: generating all_blocks data 15621 1726882579.15135: done generating all_blocks data 15621 1726882579.15136: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 15621 1726882579.15137: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 15621 1726882579.15139: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 15621 1726882579.15566: done processing included file 15621 1726882579.15568: iterating over new_blocks loaded from include file 15621 1726882579.15569: in VariableManager get_vars() 15621 1726882579.15579: done with get_vars() 15621 1726882579.15580: filtering new block on tags 15621 1726882579.15594: done filtering new block on tags 15621 1726882579.15596: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node3 15621 1726882579.15600: extending task lists for all hosts with included blocks 15621 1726882579.15663: done extending task lists 15621 1726882579.15664: done processing included files 15621 1726882579.15664: results queue empty 15621 1726882579.15665: checking for any_errors_fatal 15621 1726882579.15667: done checking for any_errors_fatal 15621 1726882579.15667: checking for max_fail_percentage 15621 1726882579.15668: done checking for max_fail_percentage 15621 1726882579.15669: checking to see if all hosts have failed and the running result is not ok 15621 1726882579.15669: done checking to see if all hosts have failed 15621 1726882579.15670: getting the remaining hosts for this loop 15621 1726882579.15671: done getting the remaining hosts for this loop 15621 1726882579.15673: getting the next task for host managed_node3 15621 1726882579.15676: done getting next task for host managed_node3 15621 1726882579.15677: ^ task is: TASK: Gather current interface info 15621 1726882579.15679: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882579.15681: getting variables 15621 1726882579.15681: in VariableManager get_vars() 15621 1726882579.15687: Calling all_inventory to load vars for managed_node3 15621 1726882579.15689: Calling groups_inventory to load vars for managed_node3 15621 1726882579.15691: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882579.15696: Calling all_plugins_play to load vars for managed_node3 15621 1726882579.15698: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882579.15700: Calling groups_plugins_play to load vars for managed_node3 15621 1726882579.15786: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882579.15898: done with get_vars() 15621 1726882579.15904: done getting variables 15621 1726882579.15937: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:36:19 -0400 (0:00:00.024) 0:00:11.238 ****** 15621 1726882579.15958: entering _queue_task() for managed_node3/command 15621 1726882579.16211: worker is 1 (out of 1 available) 15621 1726882579.16226: exiting _queue_task() for managed_node3/command 15621 1726882579.16240: done queuing things up, now waiting for results queue to drain 15621 1726882579.16242: waiting for pending results... 
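The queued task runs the command action plugin loaded just above. The module arguments recorded later in this trace ("chdir": "/sys/class/net", "_raw_params": "ls -1") imply a task along these lines; the register variable name is a placeholder invented for the sketch and does not appear in the log:

    - name: Gather current interface info
      command: ls -1
      args:
        chdir: /sys/class/net        # list the kernel network interfaces, one per line
      register: current_interfaces   # hypothetical name, not shown in this log

Listing /sys/class/net is a cheap way to enumerate interfaces without parsing ip(8) output, which is why the test task runs a plain ls -1 there.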
15621 1726882579.16414: running TaskExecutor() for managed_node3/TASK: Gather current interface info 15621 1726882579.16497: in run() - task 0affc7ec-ae25-af1a-5b92-000000000115 15621 1726882579.16509: variable 'ansible_search_path' from source: unknown 15621 1726882579.16513: variable 'ansible_search_path' from source: unknown 15621 1726882579.16551: calling self._execute() 15621 1726882579.16614: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882579.16618: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882579.16630: variable 'omit' from source: magic vars 15621 1726882579.16959: variable 'ansible_distribution_major_version' from source: facts 15621 1726882579.16969: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882579.16975: variable 'omit' from source: magic vars 15621 1726882579.17018: variable 'omit' from source: magic vars 15621 1726882579.17046: variable 'omit' from source: magic vars 15621 1726882579.17080: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882579.17110: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882579.17128: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882579.17145: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882579.17155: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882579.17182: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882579.17186: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882579.17188: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882579.17265: Set connection var ansible_connection to ssh 15621 1726882579.17275: Set connection var ansible_shell_executable to /bin/sh 15621 1726882579.17278: Set connection var ansible_timeout to 10 15621 1726882579.17281: Set connection var ansible_shell_type to sh 15621 1726882579.17287: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882579.17292: Set connection var ansible_pipelining to False 15621 1726882579.17311: variable 'ansible_shell_executable' from source: unknown 15621 1726882579.17314: variable 'ansible_connection' from source: unknown 15621 1726882579.17317: variable 'ansible_module_compression' from source: unknown 15621 1726882579.17320: variable 'ansible_shell_type' from source: unknown 15621 1726882579.17324: variable 'ansible_shell_executable' from source: unknown 15621 1726882579.17326: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882579.17329: variable 'ansible_pipelining' from source: unknown 15621 1726882579.17332: variable 'ansible_timeout' from source: unknown 15621 1726882579.17336: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882579.17445: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15621 1726882579.17452: variable 'omit' from source: magic vars 15621 
1726882579.17463: starting attempt loop 15621 1726882579.17466: running the handler 15621 1726882579.17563: _low_level_execute_command(): starting 15621 1726882579.17569: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15621 1726882579.18032: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882579.18037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882579.18049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882579.18100: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882579.18105: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882579.18112: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882579.18198: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882579.19868: stdout chunk (state=3): >>>/root <<< 15621 1726882579.19982: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882579.20033: stderr chunk (state=3): >>><<< 15621 1726882579.20036: stdout chunk (state=3): >>><<< 15621 1726882579.20056: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882579.20068: _low_level_execute_command(): starting 15621 1726882579.20074: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726882579.2005434-16005-47654547395594 `" && echo ansible-tmp-1726882579.2005434-16005-47654547395594="` echo /root/.ansible/tmp/ansible-tmp-1726882579.2005434-16005-47654547395594 `" ) && sleep 0' 15621 1726882579.20526: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882579.20530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882579.20533: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882579.20542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882579.20588: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882579.20596: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882579.20599: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882579.20679: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882579.22647: stdout chunk (state=3): >>>ansible-tmp-1726882579.2005434-16005-47654547395594=/root/.ansible/tmp/ansible-tmp-1726882579.2005434-16005-47654547395594 <<< 15621 1726882579.22773: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882579.22815: stderr chunk (state=3): >>><<< 15621 1726882579.22818: stdout chunk (state=3): >>><<< 15621 1726882579.22833: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882579.2005434-16005-47654547395594=/root/.ansible/tmp/ansible-tmp-1726882579.2005434-16005-47654547395594 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 15621 1726882579.22858: variable 'ansible_module_compression' from source: unknown 15621 1726882579.22905: ANSIBALLZ: Using generic lock for ansible.legacy.command 15621 1726882579.22908: ANSIBALLZ: Acquiring lock 15621 1726882579.22911: ANSIBALLZ: Lock acquired: 140146888266560 15621 1726882579.22913: ANSIBALLZ: Creating module 15621 1726882579.32431: ANSIBALLZ: Writing module into payload 15621 1726882579.32435: ANSIBALLZ: Writing module 15621 1726882579.32438: ANSIBALLZ: Renaming module 15621 1726882579.32440: ANSIBALLZ: Done creating module 15621 1726882579.32442: variable 'ansible_facts' from source: unknown 15621 1726882579.32498: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882579.2005434-16005-47654547395594/AnsiballZ_command.py 15621 1726882579.32739: Sending initial data 15621 1726882579.32749: Sent initial data (155 bytes) 15621 1726882579.33540: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882579.33648: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882579.35361: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 15621 1726882579.35372: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15621 1726882579.35445: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15621 1726882579.35535: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmpdhhihntq /root/.ansible/tmp/ansible-tmp-1726882579.2005434-16005-47654547395594/AnsiballZ_command.py <<< 15621 1726882579.35543: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882579.2005434-16005-47654547395594/AnsiballZ_command.py" <<< 15621 1726882579.35623: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmpdhhihntq" to remote "/root/.ansible/tmp/ansible-tmp-1726882579.2005434-16005-47654547395594/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882579.2005434-16005-47654547395594/AnsiballZ_command.py" <<< 15621 1726882579.36345: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882579.36409: stderr chunk (state=3): >>><<< 15621 1726882579.36412: stdout chunk (state=3): >>><<< 15621 1726882579.36433: done transferring module to remote 15621 1726882579.36443: _low_level_execute_command(): starting 15621 1726882579.36448: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882579.2005434-16005-47654547395594/ /root/.ansible/tmp/ansible-tmp-1726882579.2005434-16005-47654547395594/AnsiballZ_command.py && sleep 0' 15621 1726882579.36897: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882579.36901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882579.36903: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882579.36906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found <<< 15621 1726882579.36913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882579.36961: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882579.36964: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882579.37056: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882579.38889: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882579.38933: stderr chunk (state=3): >>><<< 15621 1726882579.38936: stdout chunk (state=3): >>><<< 15621 1726882579.38951: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 
originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882579.38954: _low_level_execute_command(): starting 15621 1726882579.38960: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882579.2005434-16005-47654547395594/AnsiballZ_command.py && sleep 0' 15621 1726882579.39411: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882579.39415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found <<< 15621 1726882579.39417: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882579.39419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882579.39476: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882579.39481: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882579.39570: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882579.56553: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:36:19.559144", "end": "2024-09-20 21:36:19.562615", "delta": "0:00:00.003471", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15621 1726882579.58003: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882579.58030: stderr chunk (state=3): >>>Shared connection to 10.31.45.226 closed. 
<<< 15621 1726882579.58133: stderr chunk (state=3): >>><<< 15621 1726882579.58149: stdout chunk (state=3): >>><<< 15621 1726882579.58180: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:36:19.559144", "end": "2024-09-20 21:36:19.562615", "delta": "0:00:00.003471", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 
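The consolidated result above carries the interface listing as a single newline-separated stdout string ("bonding_masters\neth0\nlo"). A registered command result like this is usually consumed through its stdout_lines field, which splits the string into a list; a minimal sketch, reusing the hypothetical register name from the previous sketch:

    - name: Show the interfaces reported by the command above
      debug:
        var: current_interfaces.stdout_lines   # ['bonding_masters', 'eth0', 'lo'] for this run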
15621 1726882579.58241: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882579.2005434-16005-47654547395594/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15621 1726882579.58326: _low_level_execute_command(): starting 15621 1726882579.58331: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882579.2005434-16005-47654547395594/ > /dev/null 2>&1 && sleep 0' 15621 1726882579.58963: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882579.58986: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882579.59003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882579.59037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882579.59051: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15621 1726882579.59102: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882579.59173: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882579.59204: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882579.59335: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882579.61354: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882579.61357: stdout chunk (state=3): >>><<< 15621 1726882579.61359: stderr chunk (state=3): >>><<< 15621 1726882579.61374: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882579.61386: handler run complete 15621 1726882579.61427: Evaluated conditional (False): False 15621 1726882579.61432: attempt loop complete, returning result 15621 1726882579.61443: _execute() done 15621 1726882579.61527: dumping result to json 15621 1726882579.61530: done dumping result, returning 15621 1726882579.61532: done running TaskExecutor() for managed_node3/TASK: Gather current interface info [0affc7ec-ae25-af1a-5b92-000000000115] 15621 1726882579.61535: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000115 15621 1726882579.61610: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000115 15621 1726882579.61614: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003471", "end": "2024-09-20 21:36:19.562615", "rc": 0, "start": "2024-09-20 21:36:19.559144" } STDOUT: bonding_masters eth0 lo 15621 1726882579.61698: no more pending results, returning what we have 15621 1726882579.61702: results queue empty 15621 1726882579.61703: checking for any_errors_fatal 15621 1726882579.61704: done checking for any_errors_fatal 15621 1726882579.61705: checking for max_fail_percentage 15621 1726882579.61706: done checking for max_fail_percentage 15621 1726882579.61707: checking to see if all hosts have failed and the running result is not ok 15621 1726882579.61709: done checking to see if all hosts have failed 15621 1726882579.61709: getting the remaining hosts for this loop 15621 1726882579.61711: done getting the remaining hosts for this loop 15621 1726882579.61716: getting the next task for host managed_node3 15621 1726882579.61724: done getting next task for host managed_node3 15621 1726882579.61727: ^ task is: TASK: Set current_interfaces 15621 1726882579.61731: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882579.61735: getting variables 15621 1726882579.61737: in VariableManager get_vars() 15621 1726882579.61769: Calling all_inventory to load vars for managed_node3 15621 1726882579.61772: Calling groups_inventory to load vars for managed_node3 15621 1726882579.61776: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882579.61789: Calling all_plugins_play to load vars for managed_node3 15621 1726882579.61792: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882579.61795: Calling groups_plugins_play to load vars for managed_node3 15621 1726882579.62378: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882579.62592: done with get_vars() 15621 1726882579.62602: done getting variables 15621 1726882579.62669: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:36:19 -0400 (0:00:00.467) 0:00:11.706 ****** 15621 1726882579.62703: entering _queue_task() for managed_node3/set_fact 15621 1726882579.63065: worker is 1 (out of 1 available) 15621 1726882579.63079: exiting _queue_task() for managed_node3/set_fact 15621 1726882579.63091: done queuing things up, now waiting for results queue to drain 15621 1726882579.63092: waiting for pending results... 
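Editor's note: the trace above shows the 'Gather current interface info' command returning bonding_masters, eth0 and lo, and the 'Set current_interfaces' task about to turn that output into a fact. The actual contents of get_current_interfaces.yml are not reproduced in this log; the following is a minimal sketch of what those two tasks plausibly look like, inferred from the module arguments ('ls -1' with chdir /sys/class/net), the registered variable name _current_interfaces, and the resulting current_interfaces fact.

  # Sketch of tasks/get_current_interfaces.yml, reconstructed from the trace (not copied from the file)
  - name: Gather current interface info
    command:
      cmd: ls -1
      chdir: /sys/class/net
    register: _current_interfaces

  - name: Set current_interfaces
    set_fact:
      # stdout_lines of "ls -1" in /sys/class/net yields one entry per interface,
      # matching the fact ["bonding_masters", "eth0", "lo"] reported below
      current_interfaces: "{{ _current_interfaces.stdout_lines }}"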
15621 1726882579.63370: running TaskExecutor() for managed_node3/TASK: Set current_interfaces 15621 1726882579.63444: in run() - task 0affc7ec-ae25-af1a-5b92-000000000116 15621 1726882579.63449: variable 'ansible_search_path' from source: unknown 15621 1726882579.63451: variable 'ansible_search_path' from source: unknown 15621 1726882579.63497: calling self._execute() 15621 1726882579.63629: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882579.63634: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882579.63637: variable 'omit' from source: magic vars 15621 1726882579.64047: variable 'ansible_distribution_major_version' from source: facts 15621 1726882579.64066: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882579.64078: variable 'omit' from source: magic vars 15621 1726882579.64141: variable 'omit' from source: magic vars 15621 1726882579.64340: variable '_current_interfaces' from source: set_fact 15621 1726882579.64344: variable 'omit' from source: magic vars 15621 1726882579.64384: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882579.64430: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882579.64464: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882579.64488: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882579.64504: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882579.64541: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882579.64555: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882579.64570: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882579.64691: Set connection var ansible_connection to ssh 15621 1726882579.64775: Set connection var ansible_shell_executable to /bin/sh 15621 1726882579.64778: Set connection var ansible_timeout to 10 15621 1726882579.64783: Set connection var ansible_shell_type to sh 15621 1726882579.64785: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882579.64787: Set connection var ansible_pipelining to False 15621 1726882579.64789: variable 'ansible_shell_executable' from source: unknown 15621 1726882579.64792: variable 'ansible_connection' from source: unknown 15621 1726882579.64794: variable 'ansible_module_compression' from source: unknown 15621 1726882579.64796: variable 'ansible_shell_type' from source: unknown 15621 1726882579.64798: variable 'ansible_shell_executable' from source: unknown 15621 1726882579.64800: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882579.64806: variable 'ansible_pipelining' from source: unknown 15621 1726882579.64885: variable 'ansible_timeout' from source: unknown 15621 1726882579.64889: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882579.64999: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 15621 1726882579.65016: variable 'omit' from source: magic vars 15621 1726882579.65033: starting attempt loop 15621 1726882579.65041: running the handler 15621 1726882579.65057: handler run complete 15621 1726882579.65101: attempt loop complete, returning result 15621 1726882579.65105: _execute() done 15621 1726882579.65108: dumping result to json 15621 1726882579.65110: done dumping result, returning 15621 1726882579.65113: done running TaskExecutor() for managed_node3/TASK: Set current_interfaces [0affc7ec-ae25-af1a-5b92-000000000116] 15621 1726882579.65115: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000116 15621 1726882579.65339: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000116 15621 1726882579.65343: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 15621 1726882579.65409: no more pending results, returning what we have 15621 1726882579.65413: results queue empty 15621 1726882579.65414: checking for any_errors_fatal 15621 1726882579.65420: done checking for any_errors_fatal 15621 1726882579.65423: checking for max_fail_percentage 15621 1726882579.65425: done checking for max_fail_percentage 15621 1726882579.65426: checking to see if all hosts have failed and the running result is not ok 15621 1726882579.65428: done checking to see if all hosts have failed 15621 1726882579.65428: getting the remaining hosts for this loop 15621 1726882579.65430: done getting the remaining hosts for this loop 15621 1726882579.65435: getting the next task for host managed_node3 15621 1726882579.65443: done getting next task for host managed_node3 15621 1726882579.65446: ^ task is: TASK: Show current_interfaces 15621 1726882579.65449: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882579.65526: getting variables 15621 1726882579.65529: in VariableManager get_vars() 15621 1726882579.65556: Calling all_inventory to load vars for managed_node3 15621 1726882579.65559: Calling groups_inventory to load vars for managed_node3 15621 1726882579.65567: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882579.65578: Calling all_plugins_play to load vars for managed_node3 15621 1726882579.65581: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882579.65585: Calling groups_plugins_play to load vars for managed_node3 15621 1726882579.65830: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882579.66066: done with get_vars() 15621 1726882579.66076: done getting variables 15621 1726882579.66144: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:36:19 -0400 (0:00:00.034) 0:00:11.740 ****** 15621 1726882579.66174: entering _queue_task() for managed_node3/debug 15621 1726882579.66429: worker is 1 (out of 1 available) 15621 1726882579.66445: exiting _queue_task() for managed_node3/debug 15621 1726882579.66457: done queuing things up, now waiting for results queue to drain 15621 1726882579.66459: waiting for pending results... 15621 1726882579.66845: running TaskExecutor() for managed_node3/TASK: Show current_interfaces 15621 1726882579.66851: in run() - task 0affc7ec-ae25-af1a-5b92-000000000105 15621 1726882579.66854: variable 'ansible_search_path' from source: unknown 15621 1726882579.66863: variable 'ansible_search_path' from source: unknown 15621 1726882579.66906: calling self._execute() 15621 1726882579.66990: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882579.67001: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882579.67012: variable 'omit' from source: magic vars 15621 1726882579.67395: variable 'ansible_distribution_major_version' from source: facts 15621 1726882579.67412: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882579.67421: variable 'omit' from source: magic vars 15621 1726882579.67465: variable 'omit' from source: magic vars 15621 1726882579.67574: variable 'current_interfaces' from source: set_fact 15621 1726882579.67611: variable 'omit' from source: magic vars 15621 1726882579.67656: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882579.67707: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882579.67733: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882579.67755: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882579.67771: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882579.67811: 
variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882579.67821: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882579.67831: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882579.67944: Set connection var ansible_connection to ssh 15621 1726882579.67959: Set connection var ansible_shell_executable to /bin/sh 15621 1726882579.67969: Set connection var ansible_timeout to 10 15621 1726882579.67976: Set connection var ansible_shell_type to sh 15621 1726882579.67985: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882579.68031: Set connection var ansible_pipelining to False 15621 1726882579.68034: variable 'ansible_shell_executable' from source: unknown 15621 1726882579.68037: variable 'ansible_connection' from source: unknown 15621 1726882579.68039: variable 'ansible_module_compression' from source: unknown 15621 1726882579.68046: variable 'ansible_shell_type' from source: unknown 15621 1726882579.68054: variable 'ansible_shell_executable' from source: unknown 15621 1726882579.68061: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882579.68068: variable 'ansible_pipelining' from source: unknown 15621 1726882579.68075: variable 'ansible_timeout' from source: unknown 15621 1726882579.68082: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882579.68250: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15621 1726882579.68254: variable 'omit' from source: magic vars 15621 1726882579.68261: starting attempt loop 15621 1726882579.68359: running the handler 15621 1726882579.68362: handler run complete 15621 1726882579.68364: attempt loop complete, returning result 15621 1726882579.68366: _execute() done 15621 1726882579.68369: dumping result to json 15621 1726882579.68371: done dumping result, returning 15621 1726882579.68373: done running TaskExecutor() for managed_node3/TASK: Show current_interfaces [0affc7ec-ae25-af1a-5b92-000000000105] 15621 1726882579.68376: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000105 15621 1726882579.68528: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000105 15621 1726882579.68532: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 15621 1726882579.68588: no more pending results, returning what we have 15621 1726882579.68591: results queue empty 15621 1726882579.68592: checking for any_errors_fatal 15621 1726882579.68599: done checking for any_errors_fatal 15621 1726882579.68600: checking for max_fail_percentage 15621 1726882579.68602: done checking for max_fail_percentage 15621 1726882579.68602: checking to see if all hosts have failed and the running result is not ok 15621 1726882579.68604: done checking to see if all hosts have failed 15621 1726882579.68605: getting the remaining hosts for this loop 15621 1726882579.68606: done getting the remaining hosts for this loop 15621 1726882579.68611: getting the next task for host managed_node3 15621 1726882579.68619: done getting next task for host managed_node3 15621 1726882579.68624: ^ task is: TASK: Include the task 'manage_test_interface.yml' 15621 1726882579.68626: ^ state is: HOST STATE: 
block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15621 1726882579.68630: getting variables 15621 1726882579.68631: in VariableManager get_vars() 15621 1726882579.68660: Calling all_inventory to load vars for managed_node3 15621 1726882579.68663: Calling groups_inventory to load vars for managed_node3 15621 1726882579.68667: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882579.68845: Calling all_plugins_play to load vars for managed_node3 15621 1726882579.68849: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882579.68854: Calling groups_plugins_play to load vars for managed_node3 15621 1726882579.69035: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882579.69248: done with get_vars() 15621 1726882579.69257: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:26 Friday 20 September 2024 21:36:19 -0400 (0:00:00.031) 0:00:11.772 ****** 15621 1726882579.69351: entering _queue_task() for managed_node3/include_tasks 15621 1726882579.69586: worker is 1 (out of 1 available) 15621 1726882579.69601: exiting _queue_task() for managed_node3/include_tasks 15621 1726882579.69613: done queuing things up, now waiting for results queue to drain 15621 1726882579.69615: waiting for pending results... 15621 1726882579.69953: running TaskExecutor() for managed_node3/TASK: Include the task 'manage_test_interface.yml' 15621 1726882579.69961: in run() - task 0affc7ec-ae25-af1a-5b92-000000000011 15621 1726882579.69980: variable 'ansible_search_path' from source: unknown 15621 1726882579.70021: calling self._execute() 15621 1726882579.70112: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882579.70126: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882579.70140: variable 'omit' from source: magic vars 15621 1726882579.70527: variable 'ansible_distribution_major_version' from source: facts 15621 1726882579.70545: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882579.70557: _execute() done 15621 1726882579.70566: dumping result to json 15621 1726882579.70574: done dumping result, returning 15621 1726882579.70587: done running TaskExecutor() for managed_node3/TASK: Include the task 'manage_test_interface.yml' [0affc7ec-ae25-af1a-5b92-000000000011] 15621 1726882579.70605: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000011 15621 1726882579.70783: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000011 15621 1726882579.70786: WORKER PROCESS EXITING 15621 1726882579.70841: no more pending results, returning what we have 15621 1726882579.70845: in VariableManager get_vars() 15621 1726882579.70877: Calling all_inventory to load vars for managed_node3 15621 1726882579.70881: Calling groups_inventory to load vars for managed_node3 15621 1726882579.70886: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882579.70901: Calling all_plugins_play to load vars for managed_node3 15621 1726882579.70903: Calling groups_plugins_inventory to load vars for managed_node3 15621 
1726882579.70907: Calling groups_plugins_play to load vars for managed_node3 15621 1726882579.71213: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882579.71406: done with get_vars() 15621 1726882579.71412: variable 'ansible_search_path' from source: unknown 15621 1726882579.71425: we have included files to process 15621 1726882579.71426: generating all_blocks data 15621 1726882579.71428: done generating all_blocks data 15621 1726882579.71432: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 15621 1726882579.71437: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 15621 1726882579.71440: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 15621 1726882579.72003: in VariableManager get_vars() 15621 1726882579.72020: done with get_vars() 15621 1726882579.72268: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 15621 1726882579.72943: done processing included file 15621 1726882579.72945: iterating over new_blocks loaded from include file 15621 1726882579.72947: in VariableManager get_vars() 15621 1726882579.72964: done with get_vars() 15621 1726882579.72966: filtering new block on tags 15621 1726882579.73000: done filtering new block on tags 15621 1726882579.73003: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed_node3 15621 1726882579.73008: extending task lists for all hosts with included blocks 15621 1726882579.73211: done extending task lists 15621 1726882579.73213: done processing included files 15621 1726882579.73213: results queue empty 15621 1726882579.73214: checking for any_errors_fatal 15621 1726882579.73217: done checking for any_errors_fatal 15621 1726882579.73218: checking for max_fail_percentage 15621 1726882579.73219: done checking for max_fail_percentage 15621 1726882579.73220: checking to see if all hosts have failed and the running result is not ok 15621 1726882579.73221: done checking to see if all hosts have failed 15621 1726882579.73225: getting the remaining hosts for this loop 15621 1726882579.73226: done getting the remaining hosts for this loop 15621 1726882579.73229: getting the next task for host managed_node3 15621 1726882579.73233: done getting next task for host managed_node3 15621 1726882579.73235: ^ task is: TASK: Ensure state in ["present", "absent"] 15621 1726882579.73238: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882579.73240: getting variables 15621 1726882579.73241: in VariableManager get_vars() 15621 1726882579.73249: Calling all_inventory to load vars for managed_node3 15621 1726882579.73252: Calling groups_inventory to load vars for managed_node3 15621 1726882579.73254: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882579.73259: Calling all_plugins_play to load vars for managed_node3 15621 1726882579.73262: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882579.73265: Calling groups_plugins_play to load vars for managed_node3 15621 1726882579.73414: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882579.73608: done with get_vars() 15621 1726882579.73621: done getting variables 15621 1726882579.73688: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Friday 20 September 2024 21:36:19 -0400 (0:00:00.043) 0:00:11.816 ****** 15621 1726882579.73713: entering _queue_task() for managed_node3/fail 15621 1726882579.73715: Creating lock for fail 15621 1726882579.74006: worker is 1 (out of 1 available) 15621 1726882579.74019: exiting _queue_task() for managed_node3/fail 15621 1726882579.74035: done queuing things up, now waiting for results queue to drain 15621 1726882579.74037: waiting for pending results... 
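Editor's note: the next two tasks in manage_test_interface.yml are input guards. Each is a fail action whose when: condition evaluates to False in the trace below ("state not in [...]" and "type not in [...]"), so both are skipped. A plausible shape for these guards, assuming the false_condition strings in the skip results mirror the when: clauses (the fail messages are hypothetical, they are not shown in the log):

  # Sketch of the guard tasks at manage_test_interface.yml:3 and :8, inferred from the skip results
  - name: Ensure state in ["present", "absent"]
    fail:
      msg: "state must be 'present' or 'absent'"   # hypothetical message
    when: state not in ["present", "absent"]

  - name: Ensure type in ["dummy", "tap", "veth"]
    fail:
      msg: "type must be 'dummy', 'tap' or 'veth'"  # hypothetical message
    when: type not in ["dummy", "tap", "veth"]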
15621 1726882579.74353: running TaskExecutor() for managed_node3/TASK: Ensure state in ["present", "absent"] 15621 1726882579.74406: in run() - task 0affc7ec-ae25-af1a-5b92-000000000131 15621 1726882579.74430: variable 'ansible_search_path' from source: unknown 15621 1726882579.74440: variable 'ansible_search_path' from source: unknown 15621 1726882579.74495: calling self._execute() 15621 1726882579.74582: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882579.74592: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882579.74604: variable 'omit' from source: magic vars 15621 1726882579.75035: variable 'ansible_distribution_major_version' from source: facts 15621 1726882579.75054: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882579.75202: variable 'state' from source: include params 15621 1726882579.75331: Evaluated conditional (state not in ["present", "absent"]): False 15621 1726882579.75335: when evaluation is False, skipping this task 15621 1726882579.75338: _execute() done 15621 1726882579.75340: dumping result to json 15621 1726882579.75342: done dumping result, returning 15621 1726882579.75345: done running TaskExecutor() for managed_node3/TASK: Ensure state in ["present", "absent"] [0affc7ec-ae25-af1a-5b92-000000000131] 15621 1726882579.75347: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000131 15621 1726882579.75413: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000131 15621 1726882579.75417: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 15621 1726882579.75471: no more pending results, returning what we have 15621 1726882579.75475: results queue empty 15621 1726882579.75476: checking for any_errors_fatal 15621 1726882579.75478: done checking for any_errors_fatal 15621 1726882579.75479: checking for max_fail_percentage 15621 1726882579.75480: done checking for max_fail_percentage 15621 1726882579.75481: checking to see if all hosts have failed and the running result is not ok 15621 1726882579.75483: done checking to see if all hosts have failed 15621 1726882579.75484: getting the remaining hosts for this loop 15621 1726882579.75485: done getting the remaining hosts for this loop 15621 1726882579.75489: getting the next task for host managed_node3 15621 1726882579.75495: done getting next task for host managed_node3 15621 1726882579.75498: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 15621 1726882579.75501: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882579.75505: getting variables 15621 1726882579.75507: in VariableManager get_vars() 15621 1726882579.75540: Calling all_inventory to load vars for managed_node3 15621 1726882579.75542: Calling groups_inventory to load vars for managed_node3 15621 1726882579.75546: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882579.75564: Calling all_plugins_play to load vars for managed_node3 15621 1726882579.75566: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882579.75570: Calling groups_plugins_play to load vars for managed_node3 15621 1726882579.75963: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882579.76177: done with get_vars() 15621 1726882579.76186: done getting variables 15621 1726882579.76247: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Friday 20 September 2024 21:36:19 -0400 (0:00:00.025) 0:00:11.841 ****** 15621 1726882579.76279: entering _queue_task() for managed_node3/fail 15621 1726882579.76633: worker is 1 (out of 1 available) 15621 1726882579.76643: exiting _queue_task() for managed_node3/fail 15621 1726882579.76653: done queuing things up, now waiting for results queue to drain 15621 1726882579.76655: waiting for pending results... 15621 1726882579.76944: running TaskExecutor() for managed_node3/TASK: Ensure type in ["dummy", "tap", "veth"] 15621 1726882579.76948: in run() - task 0affc7ec-ae25-af1a-5b92-000000000132 15621 1726882579.76951: variable 'ansible_search_path' from source: unknown 15621 1726882579.76954: variable 'ansible_search_path' from source: unknown 15621 1726882579.77028: calling self._execute() 15621 1726882579.77081: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882579.77092: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882579.77106: variable 'omit' from source: magic vars 15621 1726882579.77497: variable 'ansible_distribution_major_version' from source: facts 15621 1726882579.77514: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882579.77674: variable 'type' from source: set_fact 15621 1726882579.77700: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 15621 1726882579.77703: when evaluation is False, skipping this task 15621 1726882579.77705: _execute() done 15621 1726882579.77728: dumping result to json 15621 1726882579.77731: done dumping result, returning 15621 1726882579.77734: done running TaskExecutor() for managed_node3/TASK: Ensure type in ["dummy", "tap", "veth"] [0affc7ec-ae25-af1a-5b92-000000000132] 15621 1726882579.77798: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000132 15621 1726882579.77872: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000132 15621 1726882579.77876: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 15621 1726882579.77956: no more pending 
results, returning what we have 15621 1726882579.77960: results queue empty 15621 1726882579.77962: checking for any_errors_fatal 15621 1726882579.77968: done checking for any_errors_fatal 15621 1726882579.77969: checking for max_fail_percentage 15621 1726882579.77970: done checking for max_fail_percentage 15621 1726882579.77971: checking to see if all hosts have failed and the running result is not ok 15621 1726882579.77973: done checking to see if all hosts have failed 15621 1726882579.77973: getting the remaining hosts for this loop 15621 1726882579.77975: done getting the remaining hosts for this loop 15621 1726882579.77979: getting the next task for host managed_node3 15621 1726882579.77985: done getting next task for host managed_node3 15621 1726882579.77988: ^ task is: TASK: Include the task 'show_interfaces.yml' 15621 1726882579.77992: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15621 1726882579.77997: getting variables 15621 1726882579.77998: in VariableManager get_vars() 15621 1726882579.78033: Calling all_inventory to load vars for managed_node3 15621 1726882579.78036: Calling groups_inventory to load vars for managed_node3 15621 1726882579.78040: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882579.78056: Calling all_plugins_play to load vars for managed_node3 15621 1726882579.78059: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882579.78062: Calling groups_plugins_play to load vars for managed_node3 15621 1726882579.78428: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882579.78671: done with get_vars() 15621 1726882579.78680: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Friday 20 September 2024 21:36:19 -0400 (0:00:00.025) 0:00:11.867 ****** 15621 1726882579.78783: entering _queue_task() for managed_node3/include_tasks 15621 1726882579.79044: worker is 1 (out of 1 available) 15621 1726882579.79056: exiting _queue_task() for managed_node3/include_tasks 15621 1726882579.79068: done queuing things up, now waiting for results queue to drain 15621 1726882579.79070: waiting for pending results... 
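Editor's note: with both guards skipped, manage_test_interface.yml:13 pulls in show_interfaces.yml, which is why the trace below shows block generation and task-list extension on the controller rather than any SSH activity: include_tasks is resolved locally. Assuming it is a bare include with no extra arguments (the file itself is not shown in the log), the step is simply:

  # Sketch of manage_test_interface.yml:13 (relative path within the same tasks/ directory is assumed)
  - name: Include the task 'show_interfaces.yml'
    include_tasks: show_interfaces.yml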
15621 1726882579.79333: running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' 15621 1726882579.79531: in run() - task 0affc7ec-ae25-af1a-5b92-000000000133 15621 1726882579.79540: variable 'ansible_search_path' from source: unknown 15621 1726882579.79544: variable 'ansible_search_path' from source: unknown 15621 1726882579.79546: calling self._execute() 15621 1726882579.79618: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882579.79637: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882579.79662: variable 'omit' from source: magic vars 15621 1726882579.80091: variable 'ansible_distribution_major_version' from source: facts 15621 1726882579.80105: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882579.80201: _execute() done 15621 1726882579.80205: dumping result to json 15621 1726882579.80208: done dumping result, returning 15621 1726882579.80211: done running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' [0affc7ec-ae25-af1a-5b92-000000000133] 15621 1726882579.80213: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000133 15621 1726882579.80286: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000133 15621 1726882579.80290: WORKER PROCESS EXITING 15621 1726882579.80327: no more pending results, returning what we have 15621 1726882579.80334: in VariableManager get_vars() 15621 1726882579.80371: Calling all_inventory to load vars for managed_node3 15621 1726882579.80375: Calling groups_inventory to load vars for managed_node3 15621 1726882579.80379: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882579.80396: Calling all_plugins_play to load vars for managed_node3 15621 1726882579.80399: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882579.80403: Calling groups_plugins_play to load vars for managed_node3 15621 1726882579.80780: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882579.80983: done with get_vars() 15621 1726882579.80990: variable 'ansible_search_path' from source: unknown 15621 1726882579.80991: variable 'ansible_search_path' from source: unknown 15621 1726882579.81031: we have included files to process 15621 1726882579.81033: generating all_blocks data 15621 1726882579.81034: done generating all_blocks data 15621 1726882579.81038: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 15621 1726882579.81039: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 15621 1726882579.81042: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 15621 1726882579.81152: in VariableManager get_vars() 15621 1726882579.81170: done with get_vars() 15621 1726882579.81302: done processing included file 15621 1726882579.81304: iterating over new_blocks loaded from include file 15621 1726882579.81306: in VariableManager get_vars() 15621 1726882579.81319: done with get_vars() 15621 1726882579.81321: filtering new block on tags 15621 1726882579.81342: done filtering new block on tags 15621 1726882579.81345: done iterating over new_blocks loaded from include file included: 
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node3 15621 1726882579.81350: extending task lists for all hosts with included blocks 15621 1726882579.81843: done extending task lists 15621 1726882579.81844: done processing included files 15621 1726882579.81845: results queue empty 15621 1726882579.81846: checking for any_errors_fatal 15621 1726882579.81849: done checking for any_errors_fatal 15621 1726882579.81850: checking for max_fail_percentage 15621 1726882579.81851: done checking for max_fail_percentage 15621 1726882579.81852: checking to see if all hosts have failed and the running result is not ok 15621 1726882579.81853: done checking to see if all hosts have failed 15621 1726882579.81853: getting the remaining hosts for this loop 15621 1726882579.81855: done getting the remaining hosts for this loop 15621 1726882579.81857: getting the next task for host managed_node3 15621 1726882579.81862: done getting next task for host managed_node3 15621 1726882579.81864: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 15621 1726882579.81867: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15621 1726882579.81870: getting variables 15621 1726882579.81870: in VariableManager get_vars() 15621 1726882579.81878: Calling all_inventory to load vars for managed_node3 15621 1726882579.81881: Calling groups_inventory to load vars for managed_node3 15621 1726882579.81883: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882579.81888: Calling all_plugins_play to load vars for managed_node3 15621 1726882579.81891: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882579.81894: Calling groups_plugins_play to load vars for managed_node3 15621 1726882579.82073: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882579.82278: done with get_vars() 15621 1726882579.82286: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:36:19 -0400 (0:00:00.035) 0:00:11.902 ****** 15621 1726882579.82362: entering _queue_task() for managed_node3/include_tasks 15621 1726882579.82727: worker is 1 (out of 1 available) 15621 1726882579.82738: exiting _queue_task() for managed_node3/include_tasks 15621 1726882579.82750: done queuing things up, now waiting for results queue to drain 15621 1726882579.82752: waiting for pending results... 
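Editor's note: judging from the task paths in this trace, show_interfaces.yml is two steps: an include of get_current_interfaces.yml at line 3 and the 'Show current_interfaces' debug at line 5, whose output appeared earlier as "current_interfaces: ['bonding_masters', 'eth0', 'lo']". A minimal sketch consistent with those task paths and that debug message (again reconstructed, not copied from the file):

  # Sketch of tasks/show_interfaces.yml
  - name: Include the task 'get_current_interfaces.yml'
    include_tasks: get_current_interfaces.yml

  - name: Show current_interfaces
    debug:
      msg: "current_interfaces: {{ current_interfaces }}"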
15621 1726882579.83043: running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' 15621 1726882579.83049: in run() - task 0affc7ec-ae25-af1a-5b92-00000000015c 15621 1726882579.83065: variable 'ansible_search_path' from source: unknown 15621 1726882579.83072: variable 'ansible_search_path' from source: unknown 15621 1726882579.83118: calling self._execute() 15621 1726882579.83209: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882579.83225: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882579.83258: variable 'omit' from source: magic vars 15621 1726882579.83646: variable 'ansible_distribution_major_version' from source: facts 15621 1726882579.83694: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882579.83697: _execute() done 15621 1726882579.83700: dumping result to json 15621 1726882579.83703: done dumping result, returning 15621 1726882579.83705: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' [0affc7ec-ae25-af1a-5b92-00000000015c] 15621 1726882579.83709: sending task result for task 0affc7ec-ae25-af1a-5b92-00000000015c 15621 1726882579.83924: done sending task result for task 0affc7ec-ae25-af1a-5b92-00000000015c 15621 1726882579.83928: WORKER PROCESS EXITING 15621 1726882579.83955: no more pending results, returning what we have 15621 1726882579.83959: in VariableManager get_vars() 15621 1726882579.83992: Calling all_inventory to load vars for managed_node3 15621 1726882579.83996: Calling groups_inventory to load vars for managed_node3 15621 1726882579.84000: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882579.84013: Calling all_plugins_play to load vars for managed_node3 15621 1726882579.84083: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882579.84088: Calling groups_plugins_play to load vars for managed_node3 15621 1726882579.84228: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882579.84367: done with get_vars() 15621 1726882579.84374: variable 'ansible_search_path' from source: unknown 15621 1726882579.84375: variable 'ansible_search_path' from source: unknown 15621 1726882579.84415: we have included files to process 15621 1726882579.84416: generating all_blocks data 15621 1726882579.84417: done generating all_blocks data 15621 1726882579.84417: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 15621 1726882579.84418: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 15621 1726882579.84420: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 15621 1726882579.84602: done processing included file 15621 1726882579.84604: iterating over new_blocks loaded from include file 15621 1726882579.84605: in VariableManager get_vars() 15621 1726882579.84615: done with get_vars() 15621 1726882579.84616: filtering new block on tags 15621 1726882579.84629: done filtering new block on tags 15621 1726882579.84631: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for 
managed_node3 15621 1726882579.84634: extending task lists for all hosts with included blocks 15621 1726882579.84734: done extending task lists 15621 1726882579.84735: done processing included files 15621 1726882579.84736: results queue empty 15621 1726882579.84736: checking for any_errors_fatal 15621 1726882579.84738: done checking for any_errors_fatal 15621 1726882579.84738: checking for max_fail_percentage 15621 1726882579.84739: done checking for max_fail_percentage 15621 1726882579.84739: checking to see if all hosts have failed and the running result is not ok 15621 1726882579.84740: done checking to see if all hosts have failed 15621 1726882579.84741: getting the remaining hosts for this loop 15621 1726882579.84741: done getting the remaining hosts for this loop 15621 1726882579.84743: getting the next task for host managed_node3 15621 1726882579.84746: done getting next task for host managed_node3 15621 1726882579.84747: ^ task is: TASK: Gather current interface info 15621 1726882579.84749: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882579.84751: getting variables 15621 1726882579.84751: in VariableManager get_vars() 15621 1726882579.84757: Calling all_inventory to load vars for managed_node3 15621 1726882579.84758: Calling groups_inventory to load vars for managed_node3 15621 1726882579.84759: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882579.84763: Calling all_plugins_play to load vars for managed_node3 15621 1726882579.84764: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882579.84766: Calling groups_plugins_play to load vars for managed_node3 15621 1726882579.84854: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882579.84968: done with get_vars() 15621 1726882579.84977: done getting variables 15621 1726882579.85006: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:36:19 -0400 (0:00:00.026) 0:00:11.929 ****** 15621 1726882579.85028: entering _queue_task() for managed_node3/command 15621 1726882579.85215: worker is 1 (out of 1 available) 15621 1726882579.85231: exiting _queue_task() for managed_node3/command 15621 1726882579.85243: done queuing things up, now waiting for results queue to drain 15621 1726882579.85245: waiting for pending results... 
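Editor's note: before the command module runs below, the executor re-resolves the per-host connection settings (ansible_connection ssh, ansible_shell_type sh, timeout 10, pipelining False) and, because pipelining is off, first discovers the remote home with 'echo ~', creates a remote temp directory, and copies the AnsiballZ payload over SFTP, exactly as traced. The settings here come from defaults and host vars; as a purely hypothetical illustration, a host_vars file that would yield the same values could look like this (the real inventory contents are not part of this trace):

  # Hypothetical host_vars/managed_node3.yml; values mirror the "Set connection var" lines below
  ansible_connection: ssh
  ansible_host: 10.31.45.226
  ansible_shell_type: sh
  ansible_timeout: 10
  ansible_pipelining: false   # with pipelining off, a remote temp dir and SFTP transfer are used, as seen below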
15621 1726882579.85387: running TaskExecutor() for managed_node3/TASK: Gather current interface info 15621 1726882579.85457: in run() - task 0affc7ec-ae25-af1a-5b92-000000000193 15621 1726882579.85468: variable 'ansible_search_path' from source: unknown 15621 1726882579.85471: variable 'ansible_search_path' from source: unknown 15621 1726882579.85508: calling self._execute() 15621 1726882579.85568: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882579.85575: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882579.85584: variable 'omit' from source: magic vars 15621 1726882579.85924: variable 'ansible_distribution_major_version' from source: facts 15621 1726882579.85935: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882579.85944: variable 'omit' from source: magic vars 15621 1726882579.86007: variable 'omit' from source: magic vars 15621 1726882579.86200: variable 'omit' from source: magic vars 15621 1726882579.86204: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882579.86207: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882579.86210: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882579.86212: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882579.86215: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882579.86241: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882579.86249: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882579.86257: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882579.86369: Set connection var ansible_connection to ssh 15621 1726882579.86394: Set connection var ansible_shell_executable to /bin/sh 15621 1726882579.86405: Set connection var ansible_timeout to 10 15621 1726882579.86412: Set connection var ansible_shell_type to sh 15621 1726882579.86424: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882579.86436: Set connection var ansible_pipelining to False 15621 1726882579.86464: variable 'ansible_shell_executable' from source: unknown 15621 1726882579.86472: variable 'ansible_connection' from source: unknown 15621 1726882579.86481: variable 'ansible_module_compression' from source: unknown 15621 1726882579.86497: variable 'ansible_shell_type' from source: unknown 15621 1726882579.86505: variable 'ansible_shell_executable' from source: unknown 15621 1726882579.86512: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882579.86519: variable 'ansible_pipelining' from source: unknown 15621 1726882579.86529: variable 'ansible_timeout' from source: unknown 15621 1726882579.86537: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882579.86682: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15621 1726882579.86827: variable 'omit' from source: magic vars 15621 
1726882579.86831: starting attempt loop 15621 1726882579.86834: running the handler 15621 1726882579.86837: _low_level_execute_command(): starting 15621 1726882579.86839: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15621 1726882579.87367: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882579.87391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882579.87448: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882579.87461: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882579.87552: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882579.89315: stdout chunk (state=3): >>>/root <<< 15621 1726882579.89487: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882579.89490: stdout chunk (state=3): >>><<< 15621 1726882579.89492: stderr chunk (state=3): >>><<< 15621 1726882579.89604: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882579.89608: _low_level_execute_command(): starting 15621 1726882579.89611: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882579.8951614-16034-68834831836988 `" && echo ansible-tmp-1726882579.8951614-16034-68834831836988="` echo 
/root/.ansible/tmp/ansible-tmp-1726882579.8951614-16034-68834831836988 `" ) && sleep 0' 15621 1726882579.90240: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882579.90306: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882579.90328: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882579.90350: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882579.90481: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882579.92455: stdout chunk (state=3): >>>ansible-tmp-1726882579.8951614-16034-68834831836988=/root/.ansible/tmp/ansible-tmp-1726882579.8951614-16034-68834831836988 <<< 15621 1726882579.92580: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882579.92645: stderr chunk (state=3): >>><<< 15621 1726882579.92648: stdout chunk (state=3): >>><<< 15621 1726882579.92663: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882579.8951614-16034-68834831836988=/root/.ansible/tmp/ansible-tmp-1726882579.8951614-16034-68834831836988 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882579.92828: variable 'ansible_module_compression' from source: unknown 15621 1726882579.92832: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15621x2kdpbzd/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15621 1726882579.92836: variable 'ansible_facts' from source: 
unknown 15621 1726882579.92897: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882579.8951614-16034-68834831836988/AnsiballZ_command.py 15621 1726882579.93041: Sending initial data 15621 1726882579.93045: Sent initial data (155 bytes) 15621 1726882579.93737: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882579.93783: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882579.93788: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882579.93791: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882579.93898: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882579.95508: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 15621 1726882579.95512: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15621 1726882579.95600: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15621 1726882579.95685: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmpsy832xel /root/.ansible/tmp/ansible-tmp-1726882579.8951614-16034-68834831836988/AnsiballZ_command.py <<< 15621 1726882579.95687: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882579.8951614-16034-68834831836988/AnsiballZ_command.py" <<< 15621 1726882579.95769: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmpsy832xel" to remote "/root/.ansible/tmp/ansible-tmp-1726882579.8951614-16034-68834831836988/AnsiballZ_command.py" <<< 15621 1726882579.95774: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882579.8951614-16034-68834831836988/AnsiballZ_command.py" <<< 15621 1726882579.96720: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882579.96765: stderr chunk (state=3): >>><<< 15621 1726882579.96768: stdout chunk (state=3): >>><<< 15621 1726882579.96790: done transferring module to remote 15621 1726882579.96793: _low_level_execute_command(): starting 15621 1726882579.96796: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882579.8951614-16034-68834831836988/ /root/.ansible/tmp/ansible-tmp-1726882579.8951614-16034-68834831836988/AnsiballZ_command.py && sleep 0' 15621 1726882579.97224: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882579.97228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found <<< 15621 1726882579.97231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration <<< 15621 1726882579.97233: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found <<< 15621 1726882579.97236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882579.97293: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882579.97297: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882579.97372: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882579.99193: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882579.99239: stderr chunk (state=3): >>><<< 15621 1726882579.99242: stdout chunk (state=3): >>><<< 15621 1726882579.99254: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882579.99257: _low_level_execute_command(): starting 15621 1726882579.99262: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882579.8951614-16034-68834831836988/AnsiballZ_command.py && sleep 0' 15621 1726882579.99676: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882579.99679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882579.99682: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882579.99684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882579.99734: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882579.99738: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882579.99831: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882580.16457: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:36:20.159522", "end": "2024-09-20 21:36:20.162794", "delta": "0:00:00.003272", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15621 1726882580.18159: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 
<<< 15621 1726882580.18163: stdout chunk (state=3): >>><<< 15621 1726882580.18165: stderr chunk (state=3): >>><<< 15621 1726882580.18185: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:36:20.159522", "end": "2024-09-20 21:36:20.162794", "delta": "0:00:00.003272", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 
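The exchange above is one complete module round trip for the "Gather current interface info" task: _low_level_execute_command() discovers the remote home directory, creates a per-task tmpdir under /root/.ansible/tmp, SFTPs the AnsiballZ_command.py payload across the multiplexed SSH connection, chmods it, runs it with /usr/bin/python3.12, and reads the module result back as JSON on stdout before removing the tmpdir. Judging only from the logged module_args (chdir=/sys/class/net, _raw_params="ls -1") and from the "Evaluated conditional (False): False" and ok/changed=false result that follow, the task behind it is roughly the sketch below; this is a reconstruction for illustration, not the actual contents of the test task file, and the register name is inferred from the variable the log reads later.

# Hypothetical reconstruction of the task whose invocation is logged above.
- name: Gather current interface info
  command: ls -1
  args:
    chdir: /sys/class/net          # matches the logged module_args
  register: _current_interfaces    # inferred from the '_current_interfaces' variable read later in the log
  changed_when: false              # the module reports changed=true, but the task result is shown as ok with changed=false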
15621 1726882580.18309: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882579.8951614-16034-68834831836988/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15621 1726882580.18313: _low_level_execute_command(): starting 15621 1726882580.18316: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882579.8951614-16034-68834831836988/ > /dev/null 2>&1 && sleep 0' 15621 1726882580.18843: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882580.18856: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882580.18870: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882580.18891: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882580.18989: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882580.19012: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882580.19026: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882580.19139: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882580.21079: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882580.21082: stdout chunk (state=3): >>><<< 15621 1726882580.21089: stderr chunk (state=3): >>><<< 15621 1726882580.21243: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882580.21250: handler run complete 15621 1726882580.21276: Evaluated conditional (False): False 15621 1726882580.21312: attempt loop complete, returning result 15621 1726882580.21315: _execute() done 15621 1726882580.21318: dumping result to json 15621 1726882580.21320: done dumping result, returning 15621 1726882580.21323: done running TaskExecutor() for managed_node3/TASK: Gather current interface info [0affc7ec-ae25-af1a-5b92-000000000193] 15621 1726882580.21326: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000193 15621 1726882580.21539: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000193 15621 1726882580.21542: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003272", "end": "2024-09-20 21:36:20.162794", "rc": 0, "start": "2024-09-20 21:36:20.159522" } STDOUT: bonding_masters eth0 lo 15621 1726882580.21629: no more pending results, returning what we have 15621 1726882580.21632: results queue empty 15621 1726882580.21633: checking for any_errors_fatal 15621 1726882580.21635: done checking for any_errors_fatal 15621 1726882580.21636: checking for max_fail_percentage 15621 1726882580.21638: done checking for max_fail_percentage 15621 1726882580.21638: checking to see if all hosts have failed and the running result is not ok 15621 1726882580.21639: done checking to see if all hosts have failed 15621 1726882580.21640: getting the remaining hosts for this loop 15621 1726882580.21641: done getting the remaining hosts for this loop 15621 1726882580.21646: getting the next task for host managed_node3 15621 1726882580.21653: done getting next task for host managed_node3 15621 1726882580.21656: ^ task is: TASK: Set current_interfaces 15621 1726882580.21660: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882580.21663: getting variables 15621 1726882580.21665: in VariableManager get_vars() 15621 1726882580.21865: Calling all_inventory to load vars for managed_node3 15621 1726882580.21868: Calling groups_inventory to load vars for managed_node3 15621 1726882580.21874: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882580.21887: Calling all_plugins_play to load vars for managed_node3 15621 1726882580.21890: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882580.21893: Calling groups_plugins_play to load vars for managed_node3 15621 1726882580.22282: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882580.22995: done with get_vars() 15621 1726882580.23006: done getting variables 15621 1726882580.23068: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:36:20 -0400 (0:00:00.380) 0:00:12.310 ****** 15621 1726882580.23103: entering _queue_task() for managed_node3/set_fact 15621 1726882580.24151: worker is 1 (out of 1 available) 15621 1726882580.24162: exiting _queue_task() for managed_node3/set_fact 15621 1726882580.24175: done queuing things up, now waiting for results queue to drain 15621 1726882580.24176: waiting for pending results... 
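The Set current_interfaces task queued above never touches the remote host: the trace that follows loads the set_fact action plugin and completes the handler without any _low_level_execute_command() calls. Given the fact value it produces (current_interfaces: ['bonding_masters', 'eth0', 'lo'], i.e. the stdout lines of the previous command), a plausible sketch of the task is the following; the exact expression in the test file may differ, and the variable names are taken from the log rather than from the playbook source.

# Hypothetical sketch; variable names inferred from the log.
- name: Set current_interfaces
  set_fact:
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"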
15621 1726882580.24551: running TaskExecutor() for managed_node3/TASK: Set current_interfaces 15621 1726882580.24590: in run() - task 0affc7ec-ae25-af1a-5b92-000000000194 15621 1726882580.24606: variable 'ansible_search_path' from source: unknown 15621 1726882580.24610: variable 'ansible_search_path' from source: unknown 15621 1726882580.24866: calling self._execute() 15621 1726882580.24928: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882580.24976: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882580.24980: variable 'omit' from source: magic vars 15621 1726882580.25716: variable 'ansible_distribution_major_version' from source: facts 15621 1726882580.25732: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882580.25742: variable 'omit' from source: magic vars 15621 1726882580.25800: variable 'omit' from source: magic vars 15621 1726882580.26068: variable '_current_interfaces' from source: set_fact 15621 1726882580.26077: variable 'omit' from source: magic vars 15621 1726882580.26118: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882580.26460: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882580.26478: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882580.26500: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882580.26508: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882580.26542: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882580.26545: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882580.26547: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882580.26929: Set connection var ansible_connection to ssh 15621 1726882580.26932: Set connection var ansible_shell_executable to /bin/sh 15621 1726882580.26935: Set connection var ansible_timeout to 10 15621 1726882580.26938: Set connection var ansible_shell_type to sh 15621 1726882580.26940: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882580.26942: Set connection var ansible_pipelining to False 15621 1726882580.26945: variable 'ansible_shell_executable' from source: unknown 15621 1726882580.26948: variable 'ansible_connection' from source: unknown 15621 1726882580.26950: variable 'ansible_module_compression' from source: unknown 15621 1726882580.26952: variable 'ansible_shell_type' from source: unknown 15621 1726882580.26955: variable 'ansible_shell_executable' from source: unknown 15621 1726882580.26957: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882580.26959: variable 'ansible_pipelining' from source: unknown 15621 1726882580.26961: variable 'ansible_timeout' from source: unknown 15621 1726882580.26963: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882580.27283: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 15621 1726882580.27294: variable 'omit' from source: magic vars 15621 1726882580.27301: starting attempt loop 15621 1726882580.27304: running the handler 15621 1726882580.27315: handler run complete 15621 1726882580.27363: attempt loop complete, returning result 15621 1726882580.27366: _execute() done 15621 1726882580.27369: dumping result to json 15621 1726882580.27375: done dumping result, returning 15621 1726882580.27378: done running TaskExecutor() for managed_node3/TASK: Set current_interfaces [0affc7ec-ae25-af1a-5b92-000000000194] 15621 1726882580.27380: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000194 15621 1726882580.27448: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000194 15621 1726882580.27451: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 15621 1726882580.27535: no more pending results, returning what we have 15621 1726882580.27538: results queue empty 15621 1726882580.27539: checking for any_errors_fatal 15621 1726882580.27546: done checking for any_errors_fatal 15621 1726882580.27547: checking for max_fail_percentage 15621 1726882580.27548: done checking for max_fail_percentage 15621 1726882580.27549: checking to see if all hosts have failed and the running result is not ok 15621 1726882580.27551: done checking to see if all hosts have failed 15621 1726882580.27551: getting the remaining hosts for this loop 15621 1726882580.27553: done getting the remaining hosts for this loop 15621 1726882580.27557: getting the next task for host managed_node3 15621 1726882580.27565: done getting next task for host managed_node3 15621 1726882580.27568: ^ task is: TASK: Show current_interfaces 15621 1726882580.27573: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882580.27577: getting variables 15621 1726882580.27579: in VariableManager get_vars() 15621 1726882580.27609: Calling all_inventory to load vars for managed_node3 15621 1726882580.27612: Calling groups_inventory to load vars for managed_node3 15621 1726882580.27615: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882580.27930: Calling all_plugins_play to load vars for managed_node3 15621 1726882580.27934: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882580.27938: Calling groups_plugins_play to load vars for managed_node3 15621 1726882580.28115: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882580.28746: done with get_vars() 15621 1726882580.28758: done getting variables 15621 1726882580.28829: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:36:20 -0400 (0:00:00.057) 0:00:12.367 ****** 15621 1726882580.28865: entering _queue_task() for managed_node3/debug 15621 1726882580.29561: worker is 1 (out of 1 available) 15621 1726882580.29576: exiting _queue_task() for managed_node3/debug 15621 1726882580.29588: done queuing things up, now waiting for results queue to drain 15621 1726882580.29590: waiting for pending results... 15621 1726882580.30141: running TaskExecutor() for managed_node3/TASK: Show current_interfaces 15621 1726882580.30146: in run() - task 0affc7ec-ae25-af1a-5b92-00000000015d 15621 1726882580.30230: variable 'ansible_search_path' from source: unknown 15621 1726882580.30234: variable 'ansible_search_path' from source: unknown 15621 1726882580.30249: calling self._execute() 15621 1726882580.30348: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882580.30640: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882580.30750: variable 'omit' from source: magic vars 15621 1726882580.31305: variable 'ansible_distribution_major_version' from source: facts 15621 1726882580.31328: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882580.31341: variable 'omit' from source: magic vars 15621 1726882580.31451: variable 'omit' from source: magic vars 15621 1726882580.31672: variable 'current_interfaces' from source: set_fact 15621 1726882580.31707: variable 'omit' from source: magic vars 15621 1726882580.31881: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882580.31930: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882580.31962: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882580.32051: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882580.32072: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882580.32110: 
variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882580.32527: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882580.32530: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882580.33027: Set connection var ansible_connection to ssh 15621 1726882580.33030: Set connection var ansible_shell_executable to /bin/sh 15621 1726882580.33033: Set connection var ansible_timeout to 10 15621 1726882580.33035: Set connection var ansible_shell_type to sh 15621 1726882580.33037: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882580.33040: Set connection var ansible_pipelining to False 15621 1726882580.33042: variable 'ansible_shell_executable' from source: unknown 15621 1726882580.33044: variable 'ansible_connection' from source: unknown 15621 1726882580.33046: variable 'ansible_module_compression' from source: unknown 15621 1726882580.33049: variable 'ansible_shell_type' from source: unknown 15621 1726882580.33051: variable 'ansible_shell_executable' from source: unknown 15621 1726882580.33053: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882580.33056: variable 'ansible_pipelining' from source: unknown 15621 1726882580.33058: variable 'ansible_timeout' from source: unknown 15621 1726882580.33060: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882580.33476: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15621 1726882580.33494: variable 'omit' from source: magic vars 15621 1726882580.33503: starting attempt loop 15621 1726882580.33510: running the handler 15621 1726882580.33572: handler run complete 15621 1726882580.33593: attempt loop complete, returning result 15621 1726882580.33600: _execute() done 15621 1726882580.33609: dumping result to json 15621 1726882580.33626: done dumping result, returning 15621 1726882580.33640: done running TaskExecutor() for managed_node3/TASK: Show current_interfaces [0affc7ec-ae25-af1a-5b92-00000000015d] 15621 1726882580.33650: sending task result for task 0affc7ec-ae25-af1a-5b92-00000000015d ok: [managed_node3] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 15621 1726882580.33812: no more pending results, returning what we have 15621 1726882580.33816: results queue empty 15621 1726882580.33817: checking for any_errors_fatal 15621 1726882580.33824: done checking for any_errors_fatal 15621 1726882580.33825: checking for max_fail_percentage 15621 1726882580.33827: done checking for max_fail_percentage 15621 1726882580.33828: checking to see if all hosts have failed and the running result is not ok 15621 1726882580.33830: done checking to see if all hosts have failed 15621 1726882580.33830: getting the remaining hosts for this loop 15621 1726882580.33832: done getting the remaining hosts for this loop 15621 1726882580.33837: getting the next task for host managed_node3 15621 1726882580.33845: done getting next task for host managed_node3 15621 1726882580.33849: ^ task is: TASK: Install iproute 15621 1726882580.33854: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15621 1726882580.33859: getting variables 15621 1726882580.33861: in VariableManager get_vars() 15621 1726882580.33896: Calling all_inventory to load vars for managed_node3 15621 1726882580.33899: Calling groups_inventory to load vars for managed_node3 15621 1726882580.33904: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882580.33917: Calling all_plugins_play to load vars for managed_node3 15621 1726882580.33920: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882580.33928: Calling groups_plugins_play to load vars for managed_node3 15621 1726882580.34389: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882580.34660: done with get_vars() 15621 1726882580.34675: done getting variables 15621 1726882580.34711: done sending task result for task 0affc7ec-ae25-af1a-5b92-00000000015d 15621 1726882580.34714: WORKER PROCESS EXITING 15621 1726882580.34752: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Friday 20 September 2024 21:36:20 -0400 (0:00:00.059) 0:00:12.427 ****** 15621 1726882580.34802: entering _queue_task() for managed_node3/package 15621 1726882580.35104: worker is 1 (out of 1 available) 15621 1726882580.35120: exiting _queue_task() for managed_node3/package 15621 1726882580.35306: done queuing things up, now waiting for results queue to drain 15621 1726882580.35308: waiting for pending results... 
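Show current_interfaces above is a plain debug of the fact (MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo']), and the play then queues Install iproute from manage_test_interface.yml. The package action later hands its backend name=['iproute'] and state=present, so the two tasks are presumably along the lines of the sketch below; this is an illustrative reconstruction, and the __network_is_ostree fact consulted in the trace most likely only influences which package backend is selected, so it is left out here.

# Hypothetical sketch of the debug task shown above and the package task queued above.
- name: Show current_interfaces
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"

- name: Install iproute
  package:
    name: iproute
    state: present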
15621 1726882580.35866: running TaskExecutor() for managed_node3/TASK: Install iproute 15621 1726882580.36153: in run() - task 0affc7ec-ae25-af1a-5b92-000000000134 15621 1726882580.36177: variable 'ansible_search_path' from source: unknown 15621 1726882580.36627: variable 'ansible_search_path' from source: unknown 15621 1726882580.36631: calling self._execute() 15621 1726882580.36634: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882580.36637: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882580.36640: variable 'omit' from source: magic vars 15621 1726882580.37440: variable 'ansible_distribution_major_version' from source: facts 15621 1726882580.37463: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882580.37476: variable 'omit' from source: magic vars 15621 1726882580.37527: variable 'omit' from source: magic vars 15621 1726882580.38000: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15621 1726882580.43217: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15621 1726882580.43300: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15621 1726882580.43358: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15621 1726882580.43399: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15621 1726882580.43429: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15621 1726882580.43657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882580.43661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882580.43664: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882580.43667: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882580.43716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882580.43827: variable '__network_is_ostree' from source: set_fact 15621 1726882580.44027: variable 'omit' from source: magic vars 15621 1726882580.44031: variable 'omit' from source: magic vars 15621 1726882580.44034: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882580.44036: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882580.44039: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882580.44041: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 15621 1726882580.44045: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882580.44101: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882580.44104: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882580.44107: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882580.44289: Set connection var ansible_connection to ssh 15621 1726882580.44298: Set connection var ansible_shell_executable to /bin/sh 15621 1726882580.44304: Set connection var ansible_timeout to 10 15621 1726882580.44307: Set connection var ansible_shell_type to sh 15621 1726882580.44313: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882580.44319: Set connection var ansible_pipelining to False 15621 1726882580.44653: variable 'ansible_shell_executable' from source: unknown 15621 1726882580.44656: variable 'ansible_connection' from source: unknown 15621 1726882580.44659: variable 'ansible_module_compression' from source: unknown 15621 1726882580.44661: variable 'ansible_shell_type' from source: unknown 15621 1726882580.44664: variable 'ansible_shell_executable' from source: unknown 15621 1726882580.44667: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882580.44669: variable 'ansible_pipelining' from source: unknown 15621 1726882580.44676: variable 'ansible_timeout' from source: unknown 15621 1726882580.44681: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882580.44793: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15621 1726882580.44803: variable 'omit' from source: magic vars 15621 1726882580.44808: starting attempt loop 15621 1726882580.44811: running the handler 15621 1726882580.44818: variable 'ansible_facts' from source: unknown 15621 1726882580.44821: variable 'ansible_facts' from source: unknown 15621 1726882580.44865: _low_level_execute_command(): starting 15621 1726882580.44872: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15621 1726882580.46208: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882580.46213: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 
1726882580.46216: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882580.46245: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882580.46370: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882580.48146: stdout chunk (state=3): >>>/root <<< 15621 1726882580.48318: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882580.48335: stdout chunk (state=3): >>><<< 15621 1726882580.48358: stderr chunk (state=3): >>><<< 15621 1726882580.48386: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882580.48415: _low_level_execute_command(): starting 15621 1726882580.48452: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882580.4840088-16054-220371939333663 `" && echo ansible-tmp-1726882580.4840088-16054-220371939333663="` echo /root/.ansible/tmp/ansible-tmp-1726882580.4840088-16054-220371939333663 `" ) && sleep 0' 15621 1726882580.49256: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882580.49260: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882580.49263: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882580.49276: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 
4 <<< 15621 1726882580.49454: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882580.51452: stdout chunk (state=3): >>>ansible-tmp-1726882580.4840088-16054-220371939333663=/root/.ansible/tmp/ansible-tmp-1726882580.4840088-16054-220371939333663 <<< 15621 1726882580.51626: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882580.51635: stdout chunk (state=3): >>><<< 15621 1726882580.51645: stderr chunk (state=3): >>><<< 15621 1726882580.51668: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882580.4840088-16054-220371939333663=/root/.ansible/tmp/ansible-tmp-1726882580.4840088-16054-220371939333663 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882580.51708: variable 'ansible_module_compression' from source: unknown 15621 1726882580.51776: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 15621 1726882580.51783: ANSIBALLZ: Acquiring lock 15621 1726882580.51789: ANSIBALLZ: Lock acquired: 140146888266560 15621 1726882580.51796: ANSIBALLZ: Creating module 15621 1726882580.65531: ANSIBALLZ: Writing module into payload 15621 1726882580.65677: ANSIBALLZ: Writing module 15621 1726882580.65697: ANSIBALLZ: Renaming module 15621 1726882580.65703: ANSIBALLZ: Done creating module 15621 1726882580.65721: variable 'ansible_facts' from source: unknown 15621 1726882580.65784: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882580.4840088-16054-220371939333663/AnsiballZ_dnf.py 15621 1726882580.65891: Sending initial data 15621 1726882580.65895: Sent initial data (152 bytes) 15621 1726882580.66370: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882580.66374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882580.66377: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882580.66380: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 15621 1726882580.66382: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882580.66429: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882580.66434: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882580.66450: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882580.66537: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882580.68244: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 15621 1726882580.68251: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15621 1726882580.68336: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15621 1726882580.68435: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmp_5tao3vz /root/.ansible/tmp/ansible-tmp-1726882580.4840088-16054-220371939333663/AnsiballZ_dnf.py <<< 15621 1726882580.68439: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882580.4840088-16054-220371939333663/AnsiballZ_dnf.py" <<< 15621 1726882580.68514: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmp_5tao3vz" to remote "/root/.ansible/tmp/ansible-tmp-1726882580.4840088-16054-220371939333663/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882580.4840088-16054-220371939333663/AnsiballZ_dnf.py" <<< 15621 1726882580.69703: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882580.69790: stderr chunk (state=3): >>><<< 15621 1726882580.69793: stdout chunk (state=3): >>><<< 15621 1726882580.69812: done transferring module to remote 15621 1726882580.69826: _low_level_execute_command(): starting 15621 1726882580.69831: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882580.4840088-16054-220371939333663/ /root/.ansible/tmp/ansible-tmp-1726882580.4840088-16054-220371939333663/AnsiballZ_dnf.py && sleep 0' 15621 1726882580.70271: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882580.70274: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.226 originally 10.31.45.226 <<< 15621 1726882580.70277: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882580.70280: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882580.70282: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882580.70328: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882580.70343: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882580.70426: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882580.75310: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882580.75353: stderr chunk (state=3): >>><<< 15621 1726882580.75357: stdout chunk (state=3): >>><<< 15621 1726882580.75372: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882580.75375: _low_level_execute_command(): starting 15621 1726882580.75378: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882580.4840088-16054-220371939333663/AnsiballZ_dnf.py && sleep 0' 15621 1726882580.75783: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882580.75801: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882580.75805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882580.75808: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 
1726882580.75820: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882580.75876: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882580.75880: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882580.75973: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882581.80632: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 15621 1726882581.84716: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882581.84755: stderr chunk (state=3): >>>Shared connection to 10.31.45.226 closed. 
<<< 15621 1726882581.85170: stderr chunk (state=3): >>><<< 15621 1726882581.85174: stdout chunk (state=3): >>><<< 15621 1726882581.85181: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 
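The invocation block in the result above records every parameter the ansible.legacy.dnf module received for this task: name=["iproute"], state=present, and library defaults for the rest, with "Nothing to do" because iproute was already installed. As a hedged, approximate equivalent (the playbook's own YAML for the "Install iproute" task is not reproduced in this log), the same module call could be issued ad hoc from the controller, assuming an inventory that defines managed_node3:

    # Approximate ad-hoc equivalent of the dnf invocation shown above; illustrative only.
    ansible managed_node3 -m ansible.legacy.dnf -a 'name=iproute state=present'

An already-installed package returns rc=0 with changed=false and msg "Nothing to do", which matches the result the trace goes on to report for this task.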
15621 1726882581.85203: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882580.4840088-16054-220371939333663/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15621 1726882581.85223: _low_level_execute_command(): starting 15621 1726882581.85236: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882580.4840088-16054-220371939333663/ > /dev/null 2>&1 && sleep 0' 15621 1726882581.86647: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882581.86667: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882581.86684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882581.86703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882581.86843: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 15621 1726882581.86856: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882581.87072: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882581.87156: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882581.89157: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882581.89429: stderr chunk (state=3): >>><<< 15621 1726882581.89434: stdout chunk (state=3): >>><<< 15621 1726882581.89438: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882581.89441: handler run complete 15621 1726882581.89931: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15621 1726882581.90228: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15621 1726882581.90297: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15621 1726882581.90404: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15621 1726882581.90461: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15621 1726882581.90650: variable '__install_status' from source: unknown 15621 1726882581.90719: Evaluated conditional (__install_status is success): True 15621 1726882581.90749: attempt loop complete, returning result 15621 1726882581.90810: _execute() done 15621 1726882581.90818: dumping result to json 15621 1726882581.90832: done dumping result, returning 15621 1726882581.90845: done running TaskExecutor() for managed_node3/TASK: Install iproute [0affc7ec-ae25-af1a-5b92-000000000134] 15621 1726882581.90856: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000134 15621 1726882581.91038: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000134 ok: [managed_node3] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 15621 1726882581.91213: no more pending results, returning what we have 15621 1726882581.91217: results queue empty 15621 1726882581.91218: checking for any_errors_fatal 15621 1726882581.91227: done checking for any_errors_fatal 15621 1726882581.91228: checking for max_fail_percentage 15621 1726882581.91229: done checking for max_fail_percentage 15621 1726882581.91230: checking to see if all hosts have failed and the running result is not ok 15621 1726882581.91233: done checking to see if all hosts have failed 15621 1726882581.91234: getting the remaining hosts for this loop 15621 1726882581.91236: done getting the remaining hosts for this loop 15621 1726882581.91241: getting the next task for host managed_node3 15621 1726882581.91247: done getting next task for host managed_node3 15621 1726882581.91251: ^ task is: TASK: Create veth interface {{ interface }} 15621 1726882581.91254: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882581.91258: getting variables 15621 1726882581.91259: in VariableManager get_vars() 15621 1726882581.91291: Calling all_inventory to load vars for managed_node3 15621 1726882581.91294: Calling groups_inventory to load vars for managed_node3 15621 1726882581.91297: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882581.91311: Calling all_plugins_play to load vars for managed_node3 15621 1726882581.91314: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882581.91319: Calling groups_plugins_play to load vars for managed_node3 15621 1726882581.92140: WORKER PROCESS EXITING 15621 1726882581.92157: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882581.92381: done with get_vars() 15621 1726882581.92392: done getting variables 15621 1726882581.92593: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15621 1726882581.92765: variable 'interface' from source: set_fact TASK [Create veth interface lsr27] ********************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Friday 20 September 2024 21:36:21 -0400 (0:00:01.581) 0:00:14.008 ****** 15621 1726882581.92956: entering _queue_task() for managed_node3/command 15621 1726882581.93928: worker is 1 (out of 1 available) 15621 1726882581.93940: exiting _queue_task() for managed_node3/command 15621 1726882581.93952: done queuing things up, now waiting for results queue to drain 15621 1726882581.93954: waiting for pending results... 
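Every task in this trace repeats the same low-level pattern over the multiplexed SSH connection: make a per-task temp directory on the managed node, sftp the AnsiballZ payload into it, mark it executable, run it with the remote Python, then remove the directory. The following is a condensed, hedged summary of the _low_level_execute_command() calls that appear below for this task; the temp path is the one the trace itself uses, and only the module name and timestamps change from task to task:

    # Remote command sequence for one task, condensed from the trace below (schematic, not Ansible's literal quoting).
    # TMPDIR is the per-task directory Ansible generates; this is the one used for the first loop item below.
    TMPDIR=/root/.ansible/tmp/ansible-tmp-1726882582.0404718-16140-625437154100
    /bin/sh -c "( umask 77 && mkdir -p /root/.ansible/tmp && mkdir $TMPDIR ) && sleep 0"   # create temp dir
    # AnsiballZ_command.py is then uploaded into $TMPDIR with sftp over the same SSH master
    /bin/sh -c "chmod u+x $TMPDIR/ $TMPDIR/AnsiballZ_command.py && sleep 0"                # make payload executable
    /bin/sh -c "/usr/bin/python3.12 $TMPDIR/AnsiballZ_command.py && sleep 0"               # run the module, JSON result on stdout
    /bin/sh -c "rm -f -r $TMPDIR/ > /dev/null 2>&1 && sleep 0"                             # clean up the temp dir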
15621 1726882581.94135: running TaskExecutor() for managed_node3/TASK: Create veth interface lsr27 15621 1726882581.94510: in run() - task 0affc7ec-ae25-af1a-5b92-000000000135 15621 1726882581.94514: variable 'ansible_search_path' from source: unknown 15621 1726882581.94517: variable 'ansible_search_path' from source: unknown 15621 1726882581.95114: variable 'interface' from source: set_fact 15621 1726882581.95289: variable 'interface' from source: set_fact 15621 1726882581.95476: variable 'interface' from source: set_fact 15621 1726882581.95831: Loaded config def from plugin (lookup/items) 15621 1726882581.95985: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 15621 1726882581.95989: variable 'omit' from source: magic vars 15621 1726882581.96256: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882581.96265: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882581.96282: variable 'omit' from source: magic vars 15621 1726882581.96852: variable 'ansible_distribution_major_version' from source: facts 15621 1726882581.96916: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882581.97449: variable 'type' from source: set_fact 15621 1726882581.97453: variable 'state' from source: include params 15621 1726882581.97455: variable 'interface' from source: set_fact 15621 1726882581.97458: variable 'current_interfaces' from source: set_fact 15621 1726882581.97467: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 15621 1726882581.97524: variable 'omit' from source: magic vars 15621 1726882581.97663: variable 'omit' from source: magic vars 15621 1726882581.97780: variable 'item' from source: unknown 15621 1726882581.97914: variable 'item' from source: unknown 15621 1726882581.97942: variable 'omit' from source: magic vars 15621 1726882581.98058: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882581.98088: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882581.98128: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882581.98188: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882581.98231: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882581.98309: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882581.98430: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882581.98434: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882581.98629: Set connection var ansible_connection to ssh 15621 1726882581.98633: Set connection var ansible_shell_executable to /bin/sh 15621 1726882581.98636: Set connection var ansible_timeout to 10 15621 1726882581.98638: Set connection var ansible_shell_type to sh 15621 1726882581.98641: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882581.98644: Set connection var ansible_pipelining to False 15621 1726882581.98868: variable 'ansible_shell_executable' from source: unknown 15621 1726882581.98872: variable 'ansible_connection' from source: unknown 15621 1726882581.98875: variable 
'ansible_module_compression' from source: unknown 15621 1726882581.98878: variable 'ansible_shell_type' from source: unknown 15621 1726882581.98880: variable 'ansible_shell_executable' from source: unknown 15621 1726882581.98882: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882581.98886: variable 'ansible_pipelining' from source: unknown 15621 1726882581.98888: variable 'ansible_timeout' from source: unknown 15621 1726882581.98890: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882581.99436: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15621 1726882581.99440: variable 'omit' from source: magic vars 15621 1726882581.99442: starting attempt loop 15621 1726882581.99444: running the handler 15621 1726882581.99446: _low_level_execute_command(): starting 15621 1726882581.99448: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15621 1726882582.00910: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882582.00930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882582.00968: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found <<< 15621 1726882582.01338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882582.01403: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882582.01548: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882582.01654: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882582.03648: stdout chunk (state=3): >>>/root <<< 15621 1726882582.03651: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882582.03911: stderr chunk (state=3): >>><<< 15621 1726882582.03920: stdout chunk (state=3): >>><<< 15621 1726882582.04040: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882582.04062: _low_level_execute_command(): starting 15621 1726882582.04084: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882582.0404718-16140-625437154100 `" && echo ansible-tmp-1726882582.0404718-16140-625437154100="` echo /root/.ansible/tmp/ansible-tmp-1726882582.0404718-16140-625437154100 `" ) && sleep 0' 15621 1726882582.05441: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882582.05556: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882582.05596: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882582.05600: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882582.06055: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882582.08020: stdout chunk (state=3): >>>ansible-tmp-1726882582.0404718-16140-625437154100=/root/.ansible/tmp/ansible-tmp-1726882582.0404718-16140-625437154100 <<< 15621 1726882582.08134: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882582.08390: stderr chunk (state=3): >>><<< 15621 1726882582.08394: stdout chunk (state=3): >>><<< 15621 1726882582.08397: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882582.0404718-16140-625437154100=/root/.ansible/tmp/ansible-tmp-1726882582.0404718-16140-625437154100 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882582.08717: variable 'ansible_module_compression' from source: unknown 15621 1726882582.08720: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15621x2kdpbzd/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15621 1726882582.08725: variable 'ansible_facts' from source: unknown 15621 1726882582.08803: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882582.0404718-16140-625437154100/AnsiballZ_command.py 15621 1726882582.09264: Sending initial data 15621 1726882582.09273: Sent initial data (153 bytes) 15621 1726882582.10468: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15621 1726882582.10485: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.226 is address <<< 15621 1726882582.10538: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882582.10726: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882582.10742: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882582.10854: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882582.12541: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15621 
1726882582.12749: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15621 1726882582.12948: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmpz1dkqmt6 /root/.ansible/tmp/ansible-tmp-1726882582.0404718-16140-625437154100/AnsiballZ_command.py <<< 15621 1726882582.12952: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882582.0404718-16140-625437154100/AnsiballZ_command.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmpz1dkqmt6" to remote "/root/.ansible/tmp/ansible-tmp-1726882582.0404718-16140-625437154100/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882582.0404718-16140-625437154100/AnsiballZ_command.py" <<< 15621 1726882582.14965: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882582.15084: stderr chunk (state=3): >>><<< 15621 1726882582.15093: stdout chunk (state=3): >>><<< 15621 1726882582.15124: done transferring module to remote 15621 1726882582.15318: _low_level_execute_command(): starting 15621 1726882582.15324: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882582.0404718-16140-625437154100/ /root/.ansible/tmp/ansible-tmp-1726882582.0404718-16140-625437154100/AnsiballZ_command.py && sleep 0' 15621 1726882582.16460: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882582.16538: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882582.16693: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882582.16697: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882582.16699: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882582.17136: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882582.19167: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882582.19170: stdout chunk (state=3): >>><<< 15621 1726882582.19173: stderr chunk (state=3): >>><<< 15621 1726882582.19176: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882582.19178: _low_level_execute_command(): starting 15621 1726882582.19181: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882582.0404718-16140-625437154100/AnsiballZ_command.py && sleep 0' 15621 1726882582.20845: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882582.21301: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882582.21314: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882582.21418: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882582.38643: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "lsr27", "type", "veth", "peer", "name", "peerlsr27"], "start": "2024-09-20 21:36:22.376324", "end": "2024-09-20 21:36:22.383454", "delta": "0:00:00.007130", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add lsr27 type veth peer name peerlsr27", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15621 1726882582.40896: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 
<<< 15621 1726882582.41191: stderr chunk (state=3): >>><<< 15621 1726882582.41195: stdout chunk (state=3): >>><<< 15621 1726882582.41198: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "lsr27", "type", "veth", "peer", "name", "peerlsr27"], "start": "2024-09-20 21:36:22.376324", "end": "2024-09-20 21:36:22.383454", "delta": "0:00:00.007130", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add lsr27 type veth peer name peerlsr27", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 
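The JSON above is the raw result of the first loop item of "Create veth interface lsr27": the ansible.legacy.command module ran ip on the managed node and returned rc=0. The second item, which brings the peer device up, appears further down in the trace. Stripped of the Ansible wrapping, the commands this loop runs on managed_node3 are simply (root privileges are required, as in this run):

    # Commands executed by the "Create veth interface lsr27" loop, as recorded in this trace.
    # lsr27 / peerlsr27 come from the test's interface variable; needs CAP_NET_ADMIN (run as root here).
    ip link add lsr27 type veth peer name peerlsr27   # first loop item (result shown above)
    ip link set peerlsr27 up                          # second loop item (result shown further below)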
15621 1726882582.41201: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add lsr27 type veth peer name peerlsr27', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882582.0404718-16140-625437154100/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15621 1726882582.41203: _low_level_execute_command(): starting 15621 1726882582.41205: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882582.0404718-16140-625437154100/ > /dev/null 2>&1 && sleep 0' 15621 1726882582.42354: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882582.42539: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882582.42582: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882582.42601: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882582.42731: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882582.46438: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882582.46618: stderr chunk (state=3): >>><<< 15621 1726882582.46647: stdout chunk (state=3): >>><<< 15621 1726882582.46738: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 
originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882582.46751: handler run complete 15621 1726882582.46781: Evaluated conditional (False): False 15621 1726882582.46841: attempt loop complete, returning result 15621 1726882582.46868: variable 'item' from source: unknown 15621 1726882582.47042: variable 'item' from source: unknown ok: [managed_node3] => (item=ip link add lsr27 type veth peer name peerlsr27) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "lsr27", "type", "veth", "peer", "name", "peerlsr27" ], "delta": "0:00:00.007130", "end": "2024-09-20 21:36:22.383454", "item": "ip link add lsr27 type veth peer name peerlsr27", "rc": 0, "start": "2024-09-20 21:36:22.376324" } 15621 1726882582.47646: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882582.47649: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882582.47651: variable 'omit' from source: magic vars 15621 1726882582.48349: variable 'ansible_distribution_major_version' from source: facts 15621 1726882582.48353: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882582.50029: variable 'type' from source: set_fact 15621 1726882582.50042: variable 'state' from source: include params 15621 1726882582.50051: variable 'interface' from source: set_fact 15621 1726882582.50059: variable 'current_interfaces' from source: set_fact 15621 1726882582.50071: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 15621 1726882582.50125: variable 'omit' from source: magic vars 15621 1726882582.50148: variable 'omit' from source: magic vars 15621 1726882582.50632: variable 'item' from source: unknown 15621 1726882582.50636: variable 'item' from source: unknown 15621 1726882582.50638: variable 'omit' from source: magic vars 15621 1726882582.50641: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882582.50644: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882582.50646: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882582.50648: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882582.51070: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882582.51074: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882582.51077: Set connection var ansible_connection to ssh 15621 1726882582.51079: Set connection var ansible_shell_executable to /bin/sh 15621 1726882582.51188: Set connection var ansible_timeout to 10 15621 1726882582.51295: Set connection var ansible_shell_type to sh 15621 1726882582.51308: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882582.51317: Set connection var ansible_pipelining to False 15621 1726882582.51350: variable 'ansible_shell_executable' from source: unknown 15621 1726882582.51514: 
variable 'ansible_connection' from source: unknown 15621 1726882582.51525: variable 'ansible_module_compression' from source: unknown 15621 1726882582.51534: variable 'ansible_shell_type' from source: unknown 15621 1726882582.51542: variable 'ansible_shell_executable' from source: unknown 15621 1726882582.51550: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882582.51559: variable 'ansible_pipelining' from source: unknown 15621 1726882582.51566: variable 'ansible_timeout' from source: unknown 15621 1726882582.51575: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882582.51691: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15621 1726882582.51845: variable 'omit' from source: magic vars 15621 1726882582.51853: starting attempt loop 15621 1726882582.51859: running the handler 15621 1726882582.51871: _low_level_execute_command(): starting 15621 1726882582.51879: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15621 1726882582.53201: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882582.53219: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882582.53376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882582.53537: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882582.53657: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882582.55559: stdout chunk (state=3): >>>/root <<< 15621 1726882582.55640: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882582.55710: stderr chunk (state=3): >>><<< 15621 1726882582.55828: stdout chunk (state=3): >>><<< 15621 1726882582.55832: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882582.55840: _low_level_execute_command(): starting 15621 1726882582.55842: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882582.5579212-16140-276893472489864 `" && echo ansible-tmp-1726882582.5579212-16140-276893472489864="` echo /root/.ansible/tmp/ansible-tmp-1726882582.5579212-16140-276893472489864 `" ) && sleep 0' 15621 1726882582.57165: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882582.57220: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882582.57240: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882582.57259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882582.57277: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 15621 1726882582.57332: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882582.57493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882582.57539: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882582.57659: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882582.59642: stdout chunk (state=3): >>>ansible-tmp-1726882582.5579212-16140-276893472489864=/root/.ansible/tmp/ansible-tmp-1726882582.5579212-16140-276893472489864 <<< 15621 1726882582.59815: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882582.59967: stderr chunk (state=3): >>><<< 15621 1726882582.59970: stdout chunk (state=3): >>><<< 15621 1726882582.60021: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882582.5579212-16140-276893472489864=/root/.ansible/tmp/ansible-tmp-1726882582.5579212-16140-276893472489864 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882582.60028: variable 'ansible_module_compression' from source: unknown 15621 1726882582.60093: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15621x2kdpbzd/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15621 1726882582.60151: variable 'ansible_facts' from source: unknown 15621 1726882582.60309: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882582.5579212-16140-276893472489864/AnsiballZ_command.py 15621 1726882582.60757: Sending initial data 15621 1726882582.60760: Sent initial data (156 bytes) 15621 1726882582.62165: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882582.62370: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882582.62398: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882582.62505: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882582.64099: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 15621 1726882582.64125: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 15621 1726882582.64254: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension 
"expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15621 1726882582.64536: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15621 1726882582.64684: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmppimudt_6 /root/.ansible/tmp/ansible-tmp-1726882582.5579212-16140-276893472489864/AnsiballZ_command.py <<< 15621 1726882582.64688: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882582.5579212-16140-276893472489864/AnsiballZ_command.py" <<< 15621 1726882582.64770: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmppimudt_6" to remote "/root/.ansible/tmp/ansible-tmp-1726882582.5579212-16140-276893472489864/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882582.5579212-16140-276893472489864/AnsiballZ_command.py" <<< 15621 1726882582.66647: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882582.66650: stdout chunk (state=3): >>><<< 15621 1726882582.66653: stderr chunk (state=3): >>><<< 15621 1726882582.66656: done transferring module to remote 15621 1726882582.66658: _low_level_execute_command(): starting 15621 1726882582.66661: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882582.5579212-16140-276893472489864/ /root/.ansible/tmp/ansible-tmp-1726882582.5579212-16140-276893472489864/AnsiballZ_command.py && sleep 0' 15621 1726882582.67929: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882582.67981: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882582.68090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882582.68228: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882582.68247: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882582.68440: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882582.70240: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882582.70331: stderr chunk (state=3): >>><<< 15621 1726882582.70343: stdout chunk (state=3): >>><<< 15621 1726882582.70375: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882582.70385: _low_level_execute_command(): starting 15621 1726882582.70394: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882582.5579212-16140-276893472489864/AnsiballZ_command.py && sleep 0' 15621 1726882582.71667: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882582.71687: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882582.71702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882582.71721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882582.71860: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882582.72138: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882582.72412: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882582.89176: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerlsr27", "up"], "start": "2024-09-20 21:36:22.885973", "end": "2024-09-20 21:36:22.889908", "delta": "0:00:00.003935", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerlsr27 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15621 1726882582.91024: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared 
connection to 10.31.45.226 closed. <<< 15621 1726882582.91029: stdout chunk (state=3): >>><<< 15621 1726882582.91031: stderr chunk (state=3): >>><<< 15621 1726882582.91034: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerlsr27", "up"], "start": "2024-09-20 21:36:22.885973", "end": "2024-09-20 21:36:22.889908", "delta": "0:00:00.003935", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerlsr27 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 
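The exchange above is one complete pass of Ansible's low-level remote execution for the loop item ip link set peerlsr27 up: create a remote temp directory, upload the AnsiballZ_command.py payload over SFTP on the multiplexed SSH connection, mark it executable, run it with the remote interpreter, and read a single JSON result object back on stdout (the temp directory is removed a few steps further down). The sketch below replays that sequence by hand. It is illustrative only: the host address and interpreter path are taken from the log, while the temp-directory name and the local payload file are placeholders, and the real wrapper strings are generated by Ansible's ssh connection plugin rather than typed like this.

  host=10.31.45.226                                   # managed node address, as seen in the log
  tmp=/root/.ansible/tmp/manual-demo                  # placeholder; Ansible generates a unique ansible-tmp-* name
  ssh "$host" "umask 77 && mkdir -p \"$tmp\""         # temp directory creation step
  sftp -b - "$host" <<EOF
  put AnsiballZ_command.py $tmp/AnsiballZ_command.py
  EOF
  ssh "$host" "chmod u+x \"$tmp\" \"$tmp/AnsiballZ_command.py\""    # make dir and module executable
  ssh "$host" "/usr/bin/python3.12 \"$tmp/AnsiballZ_command.py\""   # run the module; prints one JSON object
  ssh "$host" "rm -f -r \"$tmp\" > /dev/null 2>&1"                  # cleanup, mirroring the rm step later in the log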
15621 1726882582.91037: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerlsr27 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882582.5579212-16140-276893472489864/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15621 1726882582.91043: _low_level_execute_command(): starting 15621 1726882582.91045: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882582.5579212-16140-276893472489864/ > /dev/null 2>&1 && sleep 0' 15621 1726882582.91971: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882582.91975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found <<< 15621 1726882582.92130: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882582.92134: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882582.92137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882582.92154: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882582.92249: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882582.92440: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882582.92541: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882582.94501: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882582.94504: stdout chunk (state=3): >>><<< 15621 1726882582.94512: stderr chunk (state=3): >>><<< 15621 1726882582.94550: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882582.94554: handler run complete 15621 1726882582.94556: Evaluated conditional (False): False 15621 1726882582.94727: attempt loop complete, returning result 15621 1726882582.94730: variable 'item' from source: unknown 15621 1726882582.94733: variable 'item' from source: unknown ok: [managed_node3] => (item=ip link set peerlsr27 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerlsr27", "up" ], "delta": "0:00:00.003935", "end": "2024-09-20 21:36:22.889908", "item": "ip link set peerlsr27 up", "rc": 0, "start": "2024-09-20 21:36:22.885973" } 15621 1726882582.95028: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882582.95032: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882582.95035: variable 'omit' from source: magic vars 15621 1726882582.95360: variable 'ansible_distribution_major_version' from source: facts 15621 1726882582.95367: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882582.95772: variable 'type' from source: set_fact 15621 1726882582.95780: variable 'state' from source: include params 15621 1726882582.95783: variable 'interface' from source: set_fact 15621 1726882582.95788: variable 'current_interfaces' from source: set_fact 15621 1726882582.95795: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 15621 1726882582.95800: variable 'omit' from source: magic vars 15621 1726882582.95817: variable 'omit' from source: magic vars 15621 1726882582.95862: variable 'item' from source: unknown 15621 1726882582.96132: variable 'item' from source: unknown 15621 1726882582.96151: variable 'omit' from source: magic vars 15621 1726882582.96172: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882582.96248: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882582.96252: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882582.96258: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882582.96260: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882582.96263: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882582.96300: Set connection var ansible_connection to ssh 15621 1726882582.96313: Set connection var ansible_shell_executable to /bin/sh 15621 1726882582.96317: Set connection var ansible_timeout to 10 15621 1726882582.96319: Set connection var ansible_shell_type to sh 15621 1726882582.96325: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882582.96538: Set connection var ansible_pipelining 
to False 15621 1726882582.96575: variable 'ansible_shell_executable' from source: unknown 15621 1726882582.96593: variable 'ansible_connection' from source: unknown 15621 1726882582.96600: variable 'ansible_module_compression' from source: unknown 15621 1726882582.96602: variable 'ansible_shell_type' from source: unknown 15621 1726882582.96605: variable 'ansible_shell_executable' from source: unknown 15621 1726882582.96607: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882582.96609: variable 'ansible_pipelining' from source: unknown 15621 1726882582.96611: variable 'ansible_timeout' from source: unknown 15621 1726882582.96613: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882582.96792: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15621 1726882582.96796: variable 'omit' from source: magic vars 15621 1726882582.96799: starting attempt loop 15621 1726882582.96801: running the handler 15621 1726882582.96803: _low_level_execute_command(): starting 15621 1726882582.96805: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15621 1726882582.98002: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882582.98010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found <<< 15621 1726882582.98019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15621 1726882582.98023: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882582.98027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882582.98109: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882582.98113: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882582.98158: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882582.98253: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882582.99910: stdout chunk (state=3): >>>/root <<< 15621 1726882583.00228: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882583.00232: stdout chunk (state=3): >>><<< 15621 1726882583.00234: stderr chunk (state=3): >>><<< 15621 1726882583.00237: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882583.00239: _low_level_execute_command(): starting 15621 1726882583.00242: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882583.0016947-16140-69537383459543 `" && echo ansible-tmp-1726882583.0016947-16140-69537383459543="` echo /root/.ansible/tmp/ansible-tmp-1726882583.0016947-16140-69537383459543 `" ) && sleep 0' 15621 1726882583.00851: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882583.00864: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882583.00919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882583.00990: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882583.01037: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882583.01221: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882583.03142: stdout chunk (state=3): >>>ansible-tmp-1726882583.0016947-16140-69537383459543=/root/.ansible/tmp/ansible-tmp-1726882583.0016947-16140-69537383459543 <<< 15621 1726882583.03364: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882583.03367: stdout chunk (state=3): >>><<< 15621 1726882583.03370: stderr chunk (state=3): >>><<< 15621 1726882583.03829: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882583.0016947-16140-69537383459543=/root/.ansible/tmp/ansible-tmp-1726882583.0016947-16140-69537383459543 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882583.03832: variable 'ansible_module_compression' from source: unknown 15621 1726882583.03835: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15621x2kdpbzd/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15621 1726882583.03837: variable 'ansible_facts' from source: unknown 15621 1726882583.03839: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882583.0016947-16140-69537383459543/AnsiballZ_command.py 15621 1726882583.04025: Sending initial data 15621 1726882583.04036: Sent initial data (155 bytes) 15621 1726882583.05411: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882583.05518: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882583.05560: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882583.05680: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882583.07273: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 
debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15621 1726882583.07360: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15621 1726882583.07447: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmpyh07cvhb /root/.ansible/tmp/ansible-tmp-1726882583.0016947-16140-69537383459543/AnsiballZ_command.py <<< 15621 1726882583.07451: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882583.0016947-16140-69537383459543/AnsiballZ_command.py" <<< 15621 1726882583.07551: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmpyh07cvhb" to remote "/root/.ansible/tmp/ansible-tmp-1726882583.0016947-16140-69537383459543/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882583.0016947-16140-69537383459543/AnsiballZ_command.py" <<< 15621 1726882583.09044: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882583.09257: stderr chunk (state=3): >>><<< 15621 1726882583.09260: stdout chunk (state=3): >>><<< 15621 1726882583.09288: done transferring module to remote 15621 1726882583.09297: _low_level_execute_command(): starting 15621 1726882583.09302: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882583.0016947-16140-69537383459543/ /root/.ansible/tmp/ansible-tmp-1726882583.0016947-16140-69537383459543/AnsiballZ_command.py && sleep 0' 15621 1726882583.10641: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882583.10874: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882583.10878: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882583.10880: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882583.11144: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882583.12847: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882583.12882: stderr chunk (state=3): >>><<< 15621 1726882583.12896: stdout chunk (state=3): >>><<< 15621 1726882583.12916: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882583.12919: _low_level_execute_command(): starting 15621 1726882583.12925: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882583.0016947-16140-69537383459543/AnsiballZ_command.py && sleep 0' 15621 1726882583.14256: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882583.14272: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882583.14294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882583.14336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 15621 1726882583.14358: stderr chunk (state=3): >>>debug2: match found <<< 15621 1726882583.14471: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882583.14589: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882583.14615: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882583.14754: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882583.31763: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "lsr27", "up"], "start": "2024-09-20 21:36:23.308954", "end": "2024-09-20 21:36:23.312794", "delta": "0:00:00.003840", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set lsr27 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15621 1726882583.33046: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. <<< 15621 1726882583.33059: stderr chunk (state=3): >>><<< 15621 1726882583.33071: stdout chunk (state=3): >>><<< 15621 1726882583.33096: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "lsr27", "up"], "start": "2024-09-20 21:36:23.308954", "end": "2024-09-20 21:36:23.312794", "delta": "0:00:00.003840", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set lsr27 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 
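As with the previous item, the module's stdout is exactly one JSON document (changed, stdout, stderr, rc, cmd, start, end, delta, invocation), and that object is all the controller parses to build the task result shown a little further down. A minimal sketch of pulling those fields out of a saved copy of such an object; result.json is a hypothetical file, nothing in this run writes one:

  python3 -c 'import json,sys; r=json.load(open(sys.argv[1])); print(r["rc"], r["delta"], r["cmd"])' result.json   # hypothetical saved module output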
15621 1726882583.33142: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set lsr27 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882583.0016947-16140-69537383459543/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15621 1726882583.33312: _low_level_execute_command(): starting 15621 1726882583.33316: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882583.0016947-16140-69537383459543/ > /dev/null 2>&1 && sleep 0' 15621 1726882583.34642: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882583.34713: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882583.34794: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882583.36843: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882583.36884: stderr chunk (state=3): >>><<< 15621 1726882583.37229: stdout chunk (state=3): >>><<< 15621 1726882583.37233: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882583.37236: handler run complete 15621 1726882583.37238: Evaluated conditional (False): False 15621 1726882583.37240: attempt loop complete, returning result 15621 1726882583.37242: variable 'item' from source: unknown 15621 1726882583.37244: variable 'item' from source: unknown ok: [managed_node3] => (item=ip link set lsr27 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "lsr27", "up" ], "delta": "0:00:00.003840", "end": "2024-09-20 21:36:23.312794", "item": "ip link set lsr27 up", "rc": 0, "start": "2024-09-20 21:36:23.308954" } 15621 1726882583.37729: dumping result to json 15621 1726882583.37733: done dumping result, returning 15621 1726882583.37736: done running TaskExecutor() for managed_node3/TASK: Create veth interface lsr27 [0affc7ec-ae25-af1a-5b92-000000000135] 15621 1726882583.37738: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000135 15621 1726882583.37791: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000135 15621 1726882583.37795: WORKER PROCESS EXITING 15621 1726882583.37889: no more pending results, returning what we have 15621 1726882583.37893: results queue empty 15621 1726882583.37894: checking for any_errors_fatal 15621 1726882583.37898: done checking for any_errors_fatal 15621 1726882583.37899: checking for max_fail_percentage 15621 1726882583.37900: done checking for max_fail_percentage 15621 1726882583.37901: checking to see if all hosts have failed and the running result is not ok 15621 1726882583.37903: done checking to see if all hosts have failed 15621 1726882583.37904: getting the remaining hosts for this loop 15621 1726882583.37905: done getting the remaining hosts for this loop 15621 1726882583.37909: getting the next task for host managed_node3 15621 1726882583.37916: done getting next task for host managed_node3 15621 1726882583.37919: ^ task is: TASK: Set up veth as managed by NetworkManager 15621 1726882583.37924: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882583.37928: getting variables 15621 1726882583.37930: in VariableManager get_vars() 15621 1726882583.38294: Calling all_inventory to load vars for managed_node3 15621 1726882583.38298: Calling groups_inventory to load vars for managed_node3 15621 1726882583.38302: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882583.38314: Calling all_plugins_play to load vars for managed_node3 15621 1726882583.38317: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882583.38320: Calling groups_plugins_play to load vars for managed_node3 15621 1726882583.39436: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882583.39890: done with get_vars() 15621 1726882583.39901: done getting variables 15621 1726882583.40089: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Friday 20 September 2024 21:36:23 -0400 (0:00:01.471) 0:00:15.480 ****** 15621 1726882583.40120: entering _queue_task() for managed_node3/command 15621 1726882583.41036: worker is 1 (out of 1 available) 15621 1726882583.41047: exiting _queue_task() for managed_node3/command 15621 1726882583.41059: done queuing things up, now waiting for results queue to drain 15621 1726882583.41061: waiting for pending results... 
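The task queued here ends up running a single nmcli call on the managed node; its JSON result appears at the end of this section. A hedged sketch of that call together with one way to confirm the device is no longer unmanaged; the verification line is an assumption and is not executed anywhere in this run:

  nmcli d set lsr27 managed true                 # the command the task runs, per the module result below
  nmcli -g GENERAL.STATE device show lsr27       # state should no longer read 'unmanaged'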
15621 1726882583.41591: running TaskExecutor() for managed_node3/TASK: Set up veth as managed by NetworkManager 15621 1726882583.42329: in run() - task 0affc7ec-ae25-af1a-5b92-000000000136 15621 1726882583.42334: variable 'ansible_search_path' from source: unknown 15621 1726882583.42338: variable 'ansible_search_path' from source: unknown 15621 1726882583.42345: calling self._execute() 15621 1726882583.42348: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882583.42351: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882583.42354: variable 'omit' from source: magic vars 15621 1726882583.43830: variable 'ansible_distribution_major_version' from source: facts 15621 1726882583.44130: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882583.44135: variable 'type' from source: set_fact 15621 1726882583.44634: variable 'state' from source: include params 15621 1726882583.44640: Evaluated conditional (type == 'veth' and state == 'present'): True 15621 1726882583.44643: variable 'omit' from source: magic vars 15621 1726882583.44647: variable 'omit' from source: magic vars 15621 1726882583.44720: variable 'interface' from source: set_fact 15621 1726882583.45327: variable 'omit' from source: magic vars 15621 1726882583.45331: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882583.45334: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882583.45336: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882583.45339: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882583.45342: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882583.45344: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882583.45346: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882583.45349: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882583.45837: Set connection var ansible_connection to ssh 15621 1726882583.45855: Set connection var ansible_shell_executable to /bin/sh 15621 1726882583.45866: Set connection var ansible_timeout to 10 15621 1726882583.45873: Set connection var ansible_shell_type to sh 15621 1726882583.45886: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882583.45897: Set connection var ansible_pipelining to False 15621 1726882583.46157: variable 'ansible_shell_executable' from source: unknown 15621 1726882583.46228: variable 'ansible_connection' from source: unknown 15621 1726882583.46233: variable 'ansible_module_compression' from source: unknown 15621 1726882583.46236: variable 'ansible_shell_type' from source: unknown 15621 1726882583.46240: variable 'ansible_shell_executable' from source: unknown 15621 1726882583.46243: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882583.46247: variable 'ansible_pipelining' from source: unknown 15621 1726882583.46249: variable 'ansible_timeout' from source: unknown 15621 1726882583.46252: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882583.46369: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15621 1726882583.47004: variable 'omit' from source: magic vars 15621 1726882583.47007: starting attempt loop 15621 1726882583.47010: running the handler 15621 1726882583.47012: _low_level_execute_command(): starting 15621 1726882583.47014: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15621 1726882583.48538: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882583.48542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882583.48841: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882583.49035: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882583.49437: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882583.50910: stdout chunk (state=3): >>>/root <<< 15621 1726882583.51190: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882583.51210: stderr chunk (state=3): >>><<< 15621 1726882583.51219: stdout chunk (state=3): >>><<< 15621 1726882583.51259: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882583.51282: 
_low_level_execute_command(): starting 15621 1726882583.51336: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882583.5126643-16182-185389647083206 `" && echo ansible-tmp-1726882583.5126643-16182-185389647083206="` echo /root/.ansible/tmp/ansible-tmp-1726882583.5126643-16182-185389647083206 `" ) && sleep 0' 15621 1726882583.53198: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882583.53558: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882583.53791: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882583.54005: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882583.55981: stdout chunk (state=3): >>>ansible-tmp-1726882583.5126643-16182-185389647083206=/root/.ansible/tmp/ansible-tmp-1726882583.5126643-16182-185389647083206 <<< 15621 1726882583.56230: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882583.56243: stdout chunk (state=3): >>><<< 15621 1726882583.56259: stderr chunk (state=3): >>><<< 15621 1726882583.56283: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882583.5126643-16182-185389647083206=/root/.ansible/tmp/ansible-tmp-1726882583.5126643-16182-185389647083206 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882583.56576: 
variable 'ansible_module_compression' from source: unknown 15621 1726882583.56601: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15621x2kdpbzd/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15621 1726882583.56727: variable 'ansible_facts' from source: unknown 15621 1726882583.56943: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882583.5126643-16182-185389647083206/AnsiballZ_command.py 15621 1726882583.57456: Sending initial data 15621 1726882583.57465: Sent initial data (156 bytes) 15621 1726882583.59339: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882583.59382: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882583.59638: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882583.59659: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882583.59768: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882583.61389: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15621 1726882583.61506: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15621 1726882583.61616: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmp86ax3cnm /root/.ansible/tmp/ansible-tmp-1726882583.5126643-16182-185389647083206/AnsiballZ_command.py <<< 15621 1726882583.61628: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882583.5126643-16182-185389647083206/AnsiballZ_command.py" <<< 15621 1726882583.61704: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmp86ax3cnm" to remote "/root/.ansible/tmp/ansible-tmp-1726882583.5126643-16182-185389647083206/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882583.5126643-16182-185389647083206/AnsiballZ_command.py" <<< 15621 1726882583.63413: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882583.63430: stdout chunk (state=3): >>><<< 15621 1726882583.63442: stderr chunk (state=3): >>><<< 15621 1726882583.63468: done transferring module to remote 15621 1726882583.63517: _low_level_execute_command(): starting 15621 1726882583.63693: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882583.5126643-16182-185389647083206/ /root/.ansible/tmp/ansible-tmp-1726882583.5126643-16182-185389647083206/AnsiballZ_command.py && sleep 0' 15621 1726882583.64797: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882583.64800: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882583.64803: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15621 1726882583.64921: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882583.64927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882583.65094: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882583.65200: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882583.67005: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882583.67350: stderr chunk (state=3): >>><<< 15621 1726882583.67354: stdout chunk (state=3): >>><<< 15621 1726882583.67429: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882583.67437: _low_level_execute_command(): starting 15621 1726882583.67440: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882583.5126643-16182-185389647083206/AnsiballZ_command.py && sleep 0' 15621 1726882583.68921: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882583.69238: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882583.69263: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882583.69284: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882583.69459: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882583.87865: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "lsr27", "managed", "true"], "start": "2024-09-20 21:36:23.856642", "end": "2024-09-20 21:36:23.875061", "delta": "0:00:00.018419", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set lsr27 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15621 1726882583.89345: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 
<<< 15621 1726882583.89691: stderr chunk (state=3): >>><<< 15621 1726882583.89695: stdout chunk (state=3): >>><<< 15621 1726882583.89700: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "lsr27", "managed", "true"], "start": "2024-09-20 21:36:23.856642", "end": "2024-09-20 21:36:23.875061", "delta": "0:00:00.018419", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set lsr27 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 
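Note: the raw module result above, for the command `nmcli d set lsr27 managed true`, reports the fields changed, cmd, rc, stdout, stderr, start, end and delta. The sketch below is a minimal local illustration of producing a result dictionary with those same field names using Python's subprocess module; it only mirrors what is visible in this trace and is not Ansible's command-module implementation, and the helper name run_like_command_module is hypothetical.

    import datetime
    import subprocess

    def run_like_command_module(argv):
        # Illustrative only: mirrors the result fields visible in the trace
        # (changed, cmd, rc, stdout, stderr, start, end, delta); not Ansible's code.
        start = datetime.datetime.now()
        proc = subprocess.run(argv, capture_output=True, text=True)
        end = datetime.datetime.now()
        return {
            "changed": True,        # the raw module result above reports changed=true
            "cmd": argv,
            "rc": proc.returncode,
            "stdout": proc.stdout,
            "stderr": proc.stderr,
            "start": str(start),    # e.g. "2024-09-20 21:36:23.856642", as in the trace
            "end": str(end),
            "delta": str(end - start),
        }

    if __name__ == "__main__":
        # The argv in the trace was ["nmcli", "d", "set", "lsr27", "managed", "true"];
        # echo is used here so the sketch runs on any machine.
        print(run_like_command_module(["echo", "managed"]))

The str() of a datetime and of a timedelta happen to match the start/end/delta formats shown in the JSON above, which is why no explicit formatting is needed in the sketch.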
15621 1726882583.89703: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set lsr27 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882583.5126643-16182-185389647083206/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15621 1726882583.89705: _low_level_execute_command(): starting 15621 1726882583.89707: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882583.5126643-16182-185389647083206/ > /dev/null 2>&1 && sleep 0' 15621 1726882583.91053: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882583.91056: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882583.91059: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882583.91061: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found <<< 15621 1726882583.91063: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882583.91065: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882583.91067: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882583.91069: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882583.91229: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882583.91241: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882583.91283: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882583.91451: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882583.93445: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882583.93535: stderr chunk (state=3): >>><<< 15621 1726882583.93554: stdout chunk (state=3): >>><<< 15621 1726882583.93593: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882583.93653: handler run complete 15621 1726882583.93834: Evaluated conditional (False): False 15621 1726882583.93838: attempt loop complete, returning result 15621 1726882583.93840: _execute() done 15621 1726882583.93843: dumping result to json 15621 1726882583.93940: done dumping result, returning 15621 1726882583.93944: done running TaskExecutor() for managed_node3/TASK: Set up veth as managed by NetworkManager [0affc7ec-ae25-af1a-5b92-000000000136] 15621 1726882583.93946: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000136 15621 1726882583.94046: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000136 15621 1726882583.94050: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "nmcli", "d", "set", "lsr27", "managed", "true" ], "delta": "0:00:00.018419", "end": "2024-09-20 21:36:23.875061", "rc": 0, "start": "2024-09-20 21:36:23.856642" } 15621 1726882583.94224: no more pending results, returning what we have 15621 1726882583.94228: results queue empty 15621 1726882583.94229: checking for any_errors_fatal 15621 1726882583.94246: done checking for any_errors_fatal 15621 1726882583.94247: checking for max_fail_percentage 15621 1726882583.94250: done checking for max_fail_percentage 15621 1726882583.94251: checking to see if all hosts have failed and the running result is not ok 15621 1726882583.94253: done checking to see if all hosts have failed 15621 1726882583.94253: getting the remaining hosts for this loop 15621 1726882583.94255: done getting the remaining hosts for this loop 15621 1726882583.94260: getting the next task for host managed_node3 15621 1726882583.94266: done getting next task for host managed_node3 15621 1726882583.94269: ^ task is: TASK: Delete veth interface {{ interface }} 15621 1726882583.94273: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882583.94277: getting variables 15621 1726882583.94279: in VariableManager get_vars() 15621 1726882583.94321: Calling all_inventory to load vars for managed_node3 15621 1726882583.94600: Calling groups_inventory to load vars for managed_node3 15621 1726882583.94606: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882583.94625: Calling all_plugins_play to load vars for managed_node3 15621 1726882583.94630: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882583.94636: Calling groups_plugins_play to load vars for managed_node3 15621 1726882583.95272: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882583.95724: done with get_vars() 15621 1726882583.95736: done getting variables 15621 1726882583.95803: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15621 1726882583.96137: variable 'interface' from source: set_fact TASK [Delete veth interface lsr27] ********************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Friday 20 September 2024 21:36:23 -0400 (0:00:00.561) 0:00:16.041 ****** 15621 1726882583.96261: entering _queue_task() for managed_node3/command 15621 1726882583.97141: worker is 1 (out of 1 available) 15621 1726882583.97243: exiting _queue_task() for managed_node3/command 15621 1726882583.97257: done queuing things up, now waiting for results queue to drain 15621 1726882583.97264: waiting for pending results... 
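Note: each of the next TaskExecutor entries evaluates a conditional of the form type == '<kind>' and state == '<present|absent>' and interface (not) in current_interfaces, and skips the task when it is False. The sketch below replays those five conditions in Python; only interface == 'lsr27' and the condition strings are taken from the trace, while the concrete values of type, state and current_interfaces are assumptions chosen to reproduce the skips seen here (in the play they come from set_fact and include params).

    # Assumed values; in the play they come from set_fact and include params.
    type_ = "veth"
    state = "present"
    interface = "lsr27"
    current_interfaces = ["lo", "eth0", "peerlsr27", "lsr27"]  # assumed contents

    conditions = {
        "Delete veth interface":  type_ == "veth"  and state == "absent"  and interface in current_interfaces,
        "Create dummy interface": type_ == "dummy" and state == "present" and interface not in current_interfaces,
        "Delete dummy interface": type_ == "dummy" and state == "absent"  and interface in current_interfaces,
        "Create tap interface":   type_ == "tap"   and state == "present" and interface not in current_interfaces,
        "Delete tap interface":   type_ == "tap"   and state == "absent"  and interface in current_interfaces,
    }

    for task, condition in conditions.items():
        print(f"{task} {interface}: {'run' if condition else 'skip (false_condition)'}")

With these assumed values every condition is False, which matches the "Conditional result was False" skips that follow in the trace.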
15621 1726882583.97839: running TaskExecutor() for managed_node3/TASK: Delete veth interface lsr27 15621 1726882583.97843: in run() - task 0affc7ec-ae25-af1a-5b92-000000000137 15621 1726882583.97846: variable 'ansible_search_path' from source: unknown 15621 1726882583.97849: variable 'ansible_search_path' from source: unknown 15621 1726882583.97873: calling self._execute() 15621 1726882583.98328: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882583.98332: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882583.98335: variable 'omit' from source: magic vars 15621 1726882583.98904: variable 'ansible_distribution_major_version' from source: facts 15621 1726882583.99071: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882583.99286: variable 'type' from source: set_fact 15621 1726882583.99544: variable 'state' from source: include params 15621 1726882583.99563: variable 'interface' from source: set_fact 15621 1726882583.99574: variable 'current_interfaces' from source: set_fact 15621 1726882583.99590: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False 15621 1726882583.99599: when evaluation is False, skipping this task 15621 1726882583.99607: _execute() done 15621 1726882583.99615: dumping result to json 15621 1726882583.99656: done dumping result, returning 15621 1726882583.99830: done running TaskExecutor() for managed_node3/TASK: Delete veth interface lsr27 [0affc7ec-ae25-af1a-5b92-000000000137] 15621 1726882583.99833: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000137 15621 1726882584.00057: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000137 15621 1726882584.00062: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 15621 1726882584.00126: no more pending results, returning what we have 15621 1726882584.00135: results queue empty 15621 1726882584.00136: checking for any_errors_fatal 15621 1726882584.00149: done checking for any_errors_fatal 15621 1726882584.00150: checking for max_fail_percentage 15621 1726882584.00153: done checking for max_fail_percentage 15621 1726882584.00154: checking to see if all hosts have failed and the running result is not ok 15621 1726882584.00155: done checking to see if all hosts have failed 15621 1726882584.00156: getting the remaining hosts for this loop 15621 1726882584.00157: done getting the remaining hosts for this loop 15621 1726882584.00163: getting the next task for host managed_node3 15621 1726882584.00169: done getting next task for host managed_node3 15621 1726882584.00177: ^ task is: TASK: Create dummy interface {{ interface }} 15621 1726882584.00181: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882584.00186: getting variables 15621 1726882584.00187: in VariableManager get_vars() 15621 1726882584.00220: Calling all_inventory to load vars for managed_node3 15621 1726882584.00394: Calling groups_inventory to load vars for managed_node3 15621 1726882584.00400: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882584.00412: Calling all_plugins_play to load vars for managed_node3 15621 1726882584.00416: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882584.00419: Calling groups_plugins_play to load vars for managed_node3 15621 1726882584.00959: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882584.01497: done with get_vars() 15621 1726882584.01509: done getting variables 15621 1726882584.01628: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15621 1726882584.01818: variable 'interface' from source: set_fact TASK [Create dummy interface lsr27] ******************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Friday 20 September 2024 21:36:24 -0400 (0:00:00.056) 0:00:16.098 ****** 15621 1726882584.01896: entering _queue_task() for managed_node3/command 15621 1726882584.02553: worker is 1 (out of 1 available) 15621 1726882584.02573: exiting _queue_task() for managed_node3/command 15621 1726882584.02837: done queuing things up, now waiting for results queue to drain 15621 1726882584.02841: waiting for pending results... 
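Note: the TASK banners below ("Create dummy interface lsr27", "Delete dummy interface lsr27", and so on) come from rendering the templated task names such as "Create dummy interface {{ interface }}" with the interface fact, which is why "variable 'interface' from source: set_fact" appears immediately before each banner. A minimal sketch of that substitution with Jinja2, the templating engine Ansible uses (illustrative only, not the TaskExecutor's actual code path):

    from jinja2 import Template

    # "lsr27" is the value of the 'interface' fact used throughout this trace.
    name = Template("Create dummy interface {{ interface }}").render(interface="lsr27")
    print(name)  # -> Create dummy interface lsr27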
15621 1726882584.03150: running TaskExecutor() for managed_node3/TASK: Create dummy interface lsr27 15621 1726882584.03195: in run() - task 0affc7ec-ae25-af1a-5b92-000000000138 15621 1726882584.03208: variable 'ansible_search_path' from source: unknown 15621 1726882584.03212: variable 'ansible_search_path' from source: unknown 15621 1726882584.03261: calling self._execute() 15621 1726882584.03457: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882584.03471: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882584.03481: variable 'omit' from source: magic vars 15621 1726882584.04528: variable 'ansible_distribution_major_version' from source: facts 15621 1726882584.04532: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882584.04754: variable 'type' from source: set_fact 15621 1726882584.04757: variable 'state' from source: include params 15621 1726882584.04760: variable 'interface' from source: set_fact 15621 1726882584.04763: variable 'current_interfaces' from source: set_fact 15621 1726882584.04765: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 15621 1726882584.04768: when evaluation is False, skipping this task 15621 1726882584.04771: _execute() done 15621 1726882584.04774: dumping result to json 15621 1726882584.04776: done dumping result, returning 15621 1726882584.04779: done running TaskExecutor() for managed_node3/TASK: Create dummy interface lsr27 [0affc7ec-ae25-af1a-5b92-000000000138] 15621 1726882584.04781: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000138 15621 1726882584.04862: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000138 skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 15621 1726882584.05152: no more pending results, returning what we have 15621 1726882584.05158: results queue empty 15621 1726882584.05159: checking for any_errors_fatal 15621 1726882584.05166: done checking for any_errors_fatal 15621 1726882584.05166: checking for max_fail_percentage 15621 1726882584.05168: done checking for max_fail_percentage 15621 1726882584.05169: checking to see if all hosts have failed and the running result is not ok 15621 1726882584.05173: done checking to see if all hosts have failed 15621 1726882584.05174: getting the remaining hosts for this loop 15621 1726882584.05176: done getting the remaining hosts for this loop 15621 1726882584.05180: getting the next task for host managed_node3 15621 1726882584.05187: done getting next task for host managed_node3 15621 1726882584.05190: ^ task is: TASK: Delete dummy interface {{ interface }} 15621 1726882584.05194: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882584.05199: getting variables 15621 1726882584.05201: in VariableManager get_vars() 15621 1726882584.05239: Calling all_inventory to load vars for managed_node3 15621 1726882584.05242: Calling groups_inventory to load vars for managed_node3 15621 1726882584.05247: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882584.05483: WORKER PROCESS EXITING 15621 1726882584.05509: Calling all_plugins_play to load vars for managed_node3 15621 1726882584.05513: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882584.05517: Calling groups_plugins_play to load vars for managed_node3 15621 1726882584.06218: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882584.06921: done with get_vars() 15621 1726882584.06938: done getting variables 15621 1726882584.07019: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15621 1726882584.07143: variable 'interface' from source: set_fact TASK [Delete dummy interface lsr27] ******************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Friday 20 September 2024 21:36:24 -0400 (0:00:00.052) 0:00:16.150 ****** 15621 1726882584.07168: entering _queue_task() for managed_node3/command 15621 1726882584.07464: worker is 1 (out of 1 available) 15621 1726882584.07478: exiting _queue_task() for managed_node3/command 15621 1726882584.07493: done queuing things up, now waiting for results queue to drain 15621 1726882584.07495: waiting for pending results... 
15621 1726882584.07681: running TaskExecutor() for managed_node3/TASK: Delete dummy interface lsr27 15621 1726882584.07762: in run() - task 0affc7ec-ae25-af1a-5b92-000000000139 15621 1726882584.07776: variable 'ansible_search_path' from source: unknown 15621 1726882584.07780: variable 'ansible_search_path' from source: unknown 15621 1726882584.07820: calling self._execute() 15621 1726882584.07898: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882584.07904: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882584.07912: variable 'omit' from source: magic vars 15621 1726882584.08231: variable 'ansible_distribution_major_version' from source: facts 15621 1726882584.08241: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882584.08399: variable 'type' from source: set_fact 15621 1726882584.08402: variable 'state' from source: include params 15621 1726882584.08407: variable 'interface' from source: set_fact 15621 1726882584.08411: variable 'current_interfaces' from source: set_fact 15621 1726882584.08424: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 15621 1726882584.08428: when evaluation is False, skipping this task 15621 1726882584.08433: _execute() done 15621 1726882584.08435: dumping result to json 15621 1726882584.08438: done dumping result, returning 15621 1726882584.08443: done running TaskExecutor() for managed_node3/TASK: Delete dummy interface lsr27 [0affc7ec-ae25-af1a-5b92-000000000139] 15621 1726882584.08449: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000139 15621 1726882584.08538: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000139 15621 1726882584.08542: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 15621 1726882584.08594: no more pending results, returning what we have 15621 1726882584.08601: results queue empty 15621 1726882584.08602: checking for any_errors_fatal 15621 1726882584.08607: done checking for any_errors_fatal 15621 1726882584.08608: checking for max_fail_percentage 15621 1726882584.08610: done checking for max_fail_percentage 15621 1726882584.08611: checking to see if all hosts have failed and the running result is not ok 15621 1726882584.08612: done checking to see if all hosts have failed 15621 1726882584.08613: getting the remaining hosts for this loop 15621 1726882584.08614: done getting the remaining hosts for this loop 15621 1726882584.08618: getting the next task for host managed_node3 15621 1726882584.08625: done getting next task for host managed_node3 15621 1726882584.08627: ^ task is: TASK: Create tap interface {{ interface }} 15621 1726882584.08630: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882584.08634: getting variables 15621 1726882584.08635: in VariableManager get_vars() 15621 1726882584.08662: Calling all_inventory to load vars for managed_node3 15621 1726882584.08664: Calling groups_inventory to load vars for managed_node3 15621 1726882584.08667: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882584.08682: Calling all_plugins_play to load vars for managed_node3 15621 1726882584.08684: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882584.08688: Calling groups_plugins_play to load vars for managed_node3 15621 1726882584.08953: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882584.09126: done with get_vars() 15621 1726882584.09134: done getting variables 15621 1726882584.09207: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15621 1726882584.09316: variable 'interface' from source: set_fact TASK [Create tap interface lsr27] ********************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Friday 20 September 2024 21:36:24 -0400 (0:00:00.021) 0:00:16.172 ****** 15621 1726882584.09347: entering _queue_task() for managed_node3/command 15621 1726882584.09625: worker is 1 (out of 1 available) 15621 1726882584.09639: exiting _queue_task() for managed_node3/command 15621 1726882584.09836: done queuing things up, now waiting for results queue to drain 15621 1726882584.09838: waiting for pending results... 
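Note: the "interface in current_interfaces" tests in these conditionals, and the assert_device_present.yml / get_interface_stat.yml includes further down, ultimately ask the same question: does a device named lsr27 exist on the managed node? A minimal local sketch of such a check, assuming /sys/class/net on a Linux host is the source of truth; the play itself uses the stat module and its own fact gathering rather than these hypothetical helpers.

    import os

    def interface_present(name, sysfs_root="/sys/class/net"):
        # Every network interface on a Linux host has a directory under /sys/class/net.
        return os.path.isdir(os.path.join(sysfs_root, name))

    def list_interfaces(sysfs_root="/sys/class/net"):
        # A plausible way to build something like 'current_interfaces'.
        return sorted(os.listdir(sysfs_root))

    if __name__ == "__main__":
        print("interfaces:", list_interfaces())
        print("lsr27 present:", interface_present("lsr27"))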
15621 1726882584.09990: running TaskExecutor() for managed_node3/TASK: Create tap interface lsr27 15621 1726882584.10385: in run() - task 0affc7ec-ae25-af1a-5b92-00000000013a 15621 1726882584.10389: variable 'ansible_search_path' from source: unknown 15621 1726882584.10394: variable 'ansible_search_path' from source: unknown 15621 1726882584.10577: calling self._execute() 15621 1726882584.10715: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882584.10719: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882584.10731: variable 'omit' from source: magic vars 15621 1726882584.11460: variable 'ansible_distribution_major_version' from source: facts 15621 1726882584.11484: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882584.11651: variable 'type' from source: set_fact 15621 1726882584.11655: variable 'state' from source: include params 15621 1726882584.11659: variable 'interface' from source: set_fact 15621 1726882584.11664: variable 'current_interfaces' from source: set_fact 15621 1726882584.11680: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 15621 1726882584.11683: when evaluation is False, skipping this task 15621 1726882584.11686: _execute() done 15621 1726882584.11688: dumping result to json 15621 1726882584.11691: done dumping result, returning 15621 1726882584.11694: done running TaskExecutor() for managed_node3/TASK: Create tap interface lsr27 [0affc7ec-ae25-af1a-5b92-00000000013a] 15621 1726882584.11700: sending task result for task 0affc7ec-ae25-af1a-5b92-00000000013a 15621 1726882584.11792: done sending task result for task 0affc7ec-ae25-af1a-5b92-00000000013a 15621 1726882584.11796: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 15621 1726882584.11852: no more pending results, returning what we have 15621 1726882584.11855: results queue empty 15621 1726882584.11857: checking for any_errors_fatal 15621 1726882584.11861: done checking for any_errors_fatal 15621 1726882584.11862: checking for max_fail_percentage 15621 1726882584.11864: done checking for max_fail_percentage 15621 1726882584.11865: checking to see if all hosts have failed and the running result is not ok 15621 1726882584.11866: done checking to see if all hosts have failed 15621 1726882584.11866: getting the remaining hosts for this loop 15621 1726882584.11867: done getting the remaining hosts for this loop 15621 1726882584.11874: getting the next task for host managed_node3 15621 1726882584.11879: done getting next task for host managed_node3 15621 1726882584.11881: ^ task is: TASK: Delete tap interface {{ interface }} 15621 1726882584.11885: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882584.11889: getting variables 15621 1726882584.11890: in VariableManager get_vars() 15621 1726882584.11916: Calling all_inventory to load vars for managed_node3 15621 1726882584.11919: Calling groups_inventory to load vars for managed_node3 15621 1726882584.11923: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882584.11937: Calling all_plugins_play to load vars for managed_node3 15621 1726882584.11940: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882584.11943: Calling groups_plugins_play to load vars for managed_node3 15621 1726882584.12099: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882584.12229: done with get_vars() 15621 1726882584.12239: done getting variables 15621 1726882584.12286: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15621 1726882584.12373: variable 'interface' from source: set_fact TASK [Delete tap interface lsr27] ********************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Friday 20 September 2024 21:36:24 -0400 (0:00:00.030) 0:00:16.203 ****** 15621 1726882584.12394: entering _queue_task() for managed_node3/command 15621 1726882584.12596: worker is 1 (out of 1 available) 15621 1726882584.12610: exiting _queue_task() for managed_node3/command 15621 1726882584.12625: done queuing things up, now waiting for results queue to drain 15621 1726882584.12627: waiting for pending results... 
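Note: the nmcli task earlier in this excerpt, and the "Get stat for interface lsr27" task at its end, follow the same per-task cycle that the raw shell commands show: create a timestamped directory under ~/.ansible/tmp with umask 77, transfer the AnsiballZ_*.py payload into it over SFTP, chmod u+x, execute it with the remote Python, then rm -f -r the directory. The sketch below reproduces that cycle locally; a plain file write stands in for the SFTP put, a trivial payload stands in for AnsiballZ_command.py, and the helper name run_payload_like_ansible is hypothetical, not Ansible's implementation.

    import os
    import random
    import shutil
    import subprocess
    import time

    def run_payload_like_ansible(payload_source):
        # Per-task temp dir named like ansible-tmp-<epoch>-<pid>-<random>, as in the trace.
        tmp = os.path.expanduser(
            "~/.ansible/tmp/ansible-tmp-%s-%s-%s"
            % (time.time(), os.getpid(), random.randint(0, 10**15))
        )
        old_umask = os.umask(0o077)          # ( umask 77 && mkdir -p ... )
        try:
            os.makedirs(tmp)
        finally:
            os.umask(old_umask)
        module = os.path.join(tmp, "AnsiballZ_payload.py")  # hypothetical payload name
        with open(module, "w") as f:                         # stands in for the sftp put
            f.write(payload_source)
        os.chmod(module, 0o700)                              # chmod u+x on the module
        try:
            # Assumes python3 is on PATH, as /usr/bin/python3.12 is on the managed node.
            return subprocess.run(["python3", module], capture_output=True, text=True)
        finally:
            shutil.rmtree(tmp, ignore_errors=True)           # rm -f -r <tmpdir>

    if __name__ == "__main__":
        result = run_payload_like_ansible('print("hello from the payload")')
        print(result.returncode, result.stdout.strip())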
15621 1726882584.12803: running TaskExecutor() for managed_node3/TASK: Delete tap interface lsr27 15621 1726882584.12901: in run() - task 0affc7ec-ae25-af1a-5b92-00000000013b 15621 1726882584.12914: variable 'ansible_search_path' from source: unknown 15621 1726882584.12918: variable 'ansible_search_path' from source: unknown 15621 1726882584.12983: calling self._execute() 15621 1726882584.13052: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882584.13055: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882584.13061: variable 'omit' from source: magic vars 15621 1726882584.13459: variable 'ansible_distribution_major_version' from source: facts 15621 1726882584.13530: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882584.13686: variable 'type' from source: set_fact 15621 1726882584.13689: variable 'state' from source: include params 15621 1726882584.13694: variable 'interface' from source: set_fact 15621 1726882584.13697: variable 'current_interfaces' from source: set_fact 15621 1726882584.13708: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 15621 1726882584.13711: when evaluation is False, skipping this task 15621 1726882584.13714: _execute() done 15621 1726882584.13717: dumping result to json 15621 1726882584.13777: done dumping result, returning 15621 1726882584.13785: done running TaskExecutor() for managed_node3/TASK: Delete tap interface lsr27 [0affc7ec-ae25-af1a-5b92-00000000013b] 15621 1726882584.13788: sending task result for task 0affc7ec-ae25-af1a-5b92-00000000013b 15621 1726882584.13859: done sending task result for task 0affc7ec-ae25-af1a-5b92-00000000013b 15621 1726882584.13862: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 15621 1726882584.13928: no more pending results, returning what we have 15621 1726882584.13931: results queue empty 15621 1726882584.13932: checking for any_errors_fatal 15621 1726882584.13936: done checking for any_errors_fatal 15621 1726882584.13937: checking for max_fail_percentage 15621 1726882584.13939: done checking for max_fail_percentage 15621 1726882584.13940: checking to see if all hosts have failed and the running result is not ok 15621 1726882584.13941: done checking to see if all hosts have failed 15621 1726882584.13941: getting the remaining hosts for this loop 15621 1726882584.13942: done getting the remaining hosts for this loop 15621 1726882584.13945: getting the next task for host managed_node3 15621 1726882584.13956: done getting next task for host managed_node3 15621 1726882584.13961: ^ task is: TASK: Include the task 'assert_device_present.yml' 15621 1726882584.13965: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882584.13969: getting variables 15621 1726882584.13973: in VariableManager get_vars() 15621 1726882584.13999: Calling all_inventory to load vars for managed_node3 15621 1726882584.14002: Calling groups_inventory to load vars for managed_node3 15621 1726882584.14005: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882584.14015: Calling all_plugins_play to load vars for managed_node3 15621 1726882584.14018: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882584.14021: Calling groups_plugins_play to load vars for managed_node3 15621 1726882584.14298: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882584.14518: done with get_vars() 15621 1726882584.14527: done getting variables TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:30 Friday 20 September 2024 21:36:24 -0400 (0:00:00.022) 0:00:16.225 ****** 15621 1726882584.14615: entering _queue_task() for managed_node3/include_tasks 15621 1726882584.14849: worker is 1 (out of 1 available) 15621 1726882584.14861: exiting _queue_task() for managed_node3/include_tasks 15621 1726882584.14875: done queuing things up, now waiting for results queue to drain 15621 1726882584.14877: waiting for pending results... 15621 1726882584.15325: running TaskExecutor() for managed_node3/TASK: Include the task 'assert_device_present.yml' 15621 1726882584.15331: in run() - task 0affc7ec-ae25-af1a-5b92-000000000012 15621 1726882584.15335: variable 'ansible_search_path' from source: unknown 15621 1726882584.15355: calling self._execute() 15621 1726882584.15456: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882584.15464: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882584.15478: variable 'omit' from source: magic vars 15621 1726882584.15902: variable 'ansible_distribution_major_version' from source: facts 15621 1726882584.15918: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882584.15927: _execute() done 15621 1726882584.15930: dumping result to json 15621 1726882584.15979: done dumping result, returning 15621 1726882584.15983: done running TaskExecutor() for managed_node3/TASK: Include the task 'assert_device_present.yml' [0affc7ec-ae25-af1a-5b92-000000000012] 15621 1726882584.15986: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000012 15621 1726882584.16090: no more pending results, returning what we have 15621 1726882584.16096: in VariableManager get_vars() 15621 1726882584.16135: Calling all_inventory to load vars for managed_node3 15621 1726882584.16138: Calling groups_inventory to load vars for managed_node3 15621 1726882584.16142: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882584.16159: Calling all_plugins_play to load vars for managed_node3 15621 1726882584.16163: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882584.16167: Calling groups_plugins_play to load vars for managed_node3 15621 1726882584.16450: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000012 15621 1726882584.16458: WORKER PROCESS EXITING 15621 1726882584.16476: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882584.16599: done with get_vars() 15621 
1726882584.16605: variable 'ansible_search_path' from source: unknown 15621 1726882584.16614: we have included files to process 15621 1726882584.16615: generating all_blocks data 15621 1726882584.16616: done generating all_blocks data 15621 1726882584.16620: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 15621 1726882584.16623: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 15621 1726882584.16626: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 15621 1726882584.16737: in VariableManager get_vars() 15621 1726882584.16749: done with get_vars() 15621 1726882584.16849: done processing included file 15621 1726882584.16850: iterating over new_blocks loaded from include file 15621 1726882584.16851: in VariableManager get_vars() 15621 1726882584.16859: done with get_vars() 15621 1726882584.16860: filtering new block on tags 15621 1726882584.16873: done filtering new block on tags 15621 1726882584.16875: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node3 15621 1726882584.16878: extending task lists for all hosts with included blocks 15621 1726882584.17450: done extending task lists 15621 1726882584.17451: done processing included files 15621 1726882584.17454: results queue empty 15621 1726882584.17455: checking for any_errors_fatal 15621 1726882584.17458: done checking for any_errors_fatal 15621 1726882584.17459: checking for max_fail_percentage 15621 1726882584.17460: done checking for max_fail_percentage 15621 1726882584.17461: checking to see if all hosts have failed and the running result is not ok 15621 1726882584.17462: done checking to see if all hosts have failed 15621 1726882584.17463: getting the remaining hosts for this loop 15621 1726882584.17464: done getting the remaining hosts for this loop 15621 1726882584.17478: getting the next task for host managed_node3 15621 1726882584.17482: done getting next task for host managed_node3 15621 1726882584.17485: ^ task is: TASK: Include the task 'get_interface_stat.yml' 15621 1726882584.17488: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882584.17490: getting variables 15621 1726882584.17491: in VariableManager get_vars() 15621 1726882584.17502: Calling all_inventory to load vars for managed_node3 15621 1726882584.17505: Calling groups_inventory to load vars for managed_node3 15621 1726882584.17507: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882584.17513: Calling all_plugins_play to load vars for managed_node3 15621 1726882584.17515: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882584.17518: Calling groups_plugins_play to load vars for managed_node3 15621 1726882584.17607: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882584.17743: done with get_vars() 15621 1726882584.17756: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:36:24 -0400 (0:00:00.032) 0:00:16.257 ****** 15621 1726882584.17841: entering _queue_task() for managed_node3/include_tasks 15621 1726882584.18086: worker is 1 (out of 1 available) 15621 1726882584.18099: exiting _queue_task() for managed_node3/include_tasks 15621 1726882584.18112: done queuing things up, now waiting for results queue to drain 15621 1726882584.18114: waiting for pending results... 15621 1726882584.18354: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 15621 1726882584.18949: in run() - task 0affc7ec-ae25-af1a-5b92-0000000001d3 15621 1726882584.18953: variable 'ansible_search_path' from source: unknown 15621 1726882584.18957: variable 'ansible_search_path' from source: unknown 15621 1726882584.18960: calling self._execute() 15621 1726882584.18963: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882584.18966: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882584.18968: variable 'omit' from source: magic vars 15621 1726882584.19613: variable 'ansible_distribution_major_version' from source: facts 15621 1726882584.19629: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882584.19638: _execute() done 15621 1726882584.19642: dumping result to json 15621 1726882584.19673: done dumping result, returning 15621 1726882584.19690: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [0affc7ec-ae25-af1a-5b92-0000000001d3] 15621 1726882584.19694: sending task result for task 0affc7ec-ae25-af1a-5b92-0000000001d3 15621 1726882584.19827: no more pending results, returning what we have 15621 1726882584.19834: in VariableManager get_vars() 15621 1726882584.19876: Calling all_inventory to load vars for managed_node3 15621 1726882584.19881: Calling groups_inventory to load vars for managed_node3 15621 1726882584.19885: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882584.19907: Calling all_plugins_play to load vars for managed_node3 15621 1726882584.19913: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882584.19917: Calling groups_plugins_play to load vars for managed_node3 15621 1726882584.20383: done sending task result for task 0affc7ec-ae25-af1a-5b92-0000000001d3 15621 1726882584.20388: WORKER PROCESS EXITING 15621 1726882584.20406: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 15621 1726882584.20642: done with get_vars() 15621 1726882584.20651: variable 'ansible_search_path' from source: unknown 15621 1726882584.20652: variable 'ansible_search_path' from source: unknown 15621 1726882584.20693: we have included files to process 15621 1726882584.20694: generating all_blocks data 15621 1726882584.20696: done generating all_blocks data 15621 1726882584.20699: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15621 1726882584.20700: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15621 1726882584.20703: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15621 1726882584.20887: done processing included file 15621 1726882584.20889: iterating over new_blocks loaded from include file 15621 1726882584.20890: in VariableManager get_vars() 15621 1726882584.20901: done with get_vars() 15621 1726882584.20903: filtering new block on tags 15621 1726882584.20913: done filtering new block on tags 15621 1726882584.20918: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 15621 1726882584.20925: extending task lists for all hosts with included blocks 15621 1726882584.20995: done extending task lists 15621 1726882584.20996: done processing included files 15621 1726882584.20996: results queue empty 15621 1726882584.20997: checking for any_errors_fatal 15621 1726882584.21000: done checking for any_errors_fatal 15621 1726882584.21001: checking for max_fail_percentage 15621 1726882584.21002: done checking for max_fail_percentage 15621 1726882584.21003: checking to see if all hosts have failed and the running result is not ok 15621 1726882584.21004: done checking to see if all hosts have failed 15621 1726882584.21005: getting the remaining hosts for this loop 15621 1726882584.21005: done getting the remaining hosts for this loop 15621 1726882584.21007: getting the next task for host managed_node3 15621 1726882584.21011: done getting next task for host managed_node3 15621 1726882584.21013: ^ task is: TASK: Get stat for interface {{ interface }} 15621 1726882584.21015: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882584.21016: getting variables 15621 1726882584.21017: in VariableManager get_vars() 15621 1726882584.21024: Calling all_inventory to load vars for managed_node3 15621 1726882584.21026: Calling groups_inventory to load vars for managed_node3 15621 1726882584.21028: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882584.21031: Calling all_plugins_play to load vars for managed_node3 15621 1726882584.21033: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882584.21035: Calling groups_plugins_play to load vars for managed_node3 15621 1726882584.21125: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882584.21257: done with get_vars() 15621 1726882584.21263: done getting variables 15621 1726882584.21382: variable 'interface' from source: set_fact TASK [Get stat for interface lsr27] ******************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:36:24 -0400 (0:00:00.035) 0:00:16.293 ****** 15621 1726882584.21404: entering _queue_task() for managed_node3/stat 15621 1726882584.21624: worker is 1 (out of 1 available) 15621 1726882584.21638: exiting _queue_task() for managed_node3/stat 15621 1726882584.21651: done queuing things up, now waiting for results queue to drain 15621 1726882584.21652: waiting for pending results... 15621 1726882584.21815: running TaskExecutor() for managed_node3/TASK: Get stat for interface lsr27 15621 1726882584.21894: in run() - task 0affc7ec-ae25-af1a-5b92-00000000021e 15621 1726882584.21905: variable 'ansible_search_path' from source: unknown 15621 1726882584.21909: variable 'ansible_search_path' from source: unknown 15621 1726882584.21941: calling self._execute() 15621 1726882584.22004: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882584.22010: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882584.22018: variable 'omit' from source: magic vars 15621 1726882584.22290: variable 'ansible_distribution_major_version' from source: facts 15621 1726882584.22300: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882584.22306: variable 'omit' from source: magic vars 15621 1726882584.22344: variable 'omit' from source: magic vars 15621 1726882584.22414: variable 'interface' from source: set_fact 15621 1726882584.22439: variable 'omit' from source: magic vars 15621 1726882584.22478: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882584.22519: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882584.22547: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882584.22578: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882584.22582: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882584.22629: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882584.22634: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882584.22637: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 15621 1726882584.22927: Set connection var ansible_connection to ssh 15621 1726882584.22931: Set connection var ansible_shell_executable to /bin/sh 15621 1726882584.22933: Set connection var ansible_timeout to 10 15621 1726882584.22936: Set connection var ansible_shell_type to sh 15621 1726882584.22938: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882584.22940: Set connection var ansible_pipelining to False 15621 1726882584.22943: variable 'ansible_shell_executable' from source: unknown 15621 1726882584.22945: variable 'ansible_connection' from source: unknown 15621 1726882584.22948: variable 'ansible_module_compression' from source: unknown 15621 1726882584.22950: variable 'ansible_shell_type' from source: unknown 15621 1726882584.22953: variable 'ansible_shell_executable' from source: unknown 15621 1726882584.22955: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882584.22957: variable 'ansible_pipelining' from source: unknown 15621 1726882584.22959: variable 'ansible_timeout' from source: unknown 15621 1726882584.22961: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882584.23004: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15621 1726882584.23014: variable 'omit' from source: magic vars 15621 1726882584.23020: starting attempt loop 15621 1726882584.23025: running the handler 15621 1726882584.23087: _low_level_execute_command(): starting 15621 1726882584.23091: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15621 1726882584.23721: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882584.23731: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882584.23869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882584.23989: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882584.25733: stdout chunk (state=3): >>>/root <<< 15621 1726882584.25841: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882584.25892: stderr chunk (state=3): >>><<< 15621 1726882584.25895: stdout chunk (state=3): >>><<< 15621 1726882584.25918: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882584.25933: _low_level_execute_command(): starting 15621 1726882584.25940: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882584.2591746-16226-124426523688062 `" && echo ansible-tmp-1726882584.2591746-16226-124426523688062="` echo /root/.ansible/tmp/ansible-tmp-1726882584.2591746-16226-124426523688062 `" ) && sleep 0' 15621 1726882584.26558: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882584.26573: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882584.26691: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882584.28645: stdout chunk (state=3): >>>ansible-tmp-1726882584.2591746-16226-124426523688062=/root/.ansible/tmp/ansible-tmp-1726882584.2591746-16226-124426523688062 <<< 15621 1726882584.28767: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882584.28816: stderr chunk (state=3): >>><<< 15621 1726882584.28820: stdout chunk (state=3): >>><<< 15621 1726882584.28838: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882584.2591746-16226-124426523688062=/root/.ansible/tmp/ansible-tmp-1726882584.2591746-16226-124426523688062 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882584.28880: variable 'ansible_module_compression' from source: unknown 15621 1726882584.28928: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15621x2kdpbzd/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 15621 1726882584.28960: variable 'ansible_facts' from source: unknown 15621 1726882584.29027: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882584.2591746-16226-124426523688062/AnsiballZ_stat.py 15621 1726882584.29134: Sending initial data 15621 1726882584.29138: Sent initial data (153 bytes) 15621 1726882584.29609: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882584.29613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882584.29616: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882584.29618: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882584.29623: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882584.29666: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882584.29669: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882584.29763: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882584.31346: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports 
extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15621 1726882584.31431: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15621 1726882584.31515: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmpaqqo7q62 /root/.ansible/tmp/ansible-tmp-1726882584.2591746-16226-124426523688062/AnsiballZ_stat.py <<< 15621 1726882584.31518: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882584.2591746-16226-124426523688062/AnsiballZ_stat.py" <<< 15621 1726882584.31595: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmpaqqo7q62" to remote "/root/.ansible/tmp/ansible-tmp-1726882584.2591746-16226-124426523688062/AnsiballZ_stat.py" <<< 15621 1726882584.31601: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882584.2591746-16226-124426523688062/AnsiballZ_stat.py" <<< 15621 1726882584.32328: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882584.32394: stderr chunk (state=3): >>><<< 15621 1726882584.32398: stdout chunk (state=3): >>><<< 15621 1726882584.32425: done transferring module to remote 15621 1726882584.32434: _low_level_execute_command(): starting 15621 1726882584.32440: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882584.2591746-16226-124426523688062/ /root/.ansible/tmp/ansible-tmp-1726882584.2591746-16226-124426523688062/AnsiballZ_stat.py && sleep 0' 15621 1726882584.32911: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882584.32915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882584.32917: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882584.32920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882584.32982: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882584.32985: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882584.33063: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882584.34875: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 15621 1726882584.34932: stderr chunk (state=3): >>><<< 15621 1726882584.34935: stdout chunk (state=3): >>><<< 15621 1726882584.34951: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882584.34954: _low_level_execute_command(): starting 15621 1726882584.34957: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882584.2591746-16226-124426523688062/AnsiballZ_stat.py && sleep 0' 15621 1726882584.35441: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882584.35445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found <<< 15621 1726882584.35447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15621 1726882584.35451: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882584.35453: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882584.35498: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882584.35503: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882584.35598: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882584.52139: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/lsr27", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 38370, "dev": 
23, "nlink": 1, "atime": 1726882582.3799071, "mtime": 1726882582.3799071, "ctime": 1726882582.3799071, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/lsr27", "lnk_target": "../../devices/virtual/net/lsr27", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr27", "follow": false, "checksum_algorithm": "sha1"}}} <<< 15621 1726882584.53679: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. <<< 15621 1726882584.53683: stdout chunk (state=3): >>><<< 15621 1726882584.53685: stderr chunk (state=3): >>><<< 15621 1726882584.53687: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/lsr27", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 38370, "dev": 23, "nlink": 1, "atime": 1726882582.3799071, "mtime": 1726882582.3799071, "ctime": 1726882582.3799071, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/lsr27", "lnk_target": "../../devices/virtual/net/lsr27", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr27", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 
15621 1726882584.53783: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/lsr27', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882584.2591746-16226-124426523688062/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15621 1726882584.53787: _low_level_execute_command(): starting 15621 1726882584.53789: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882584.2591746-16226-124426523688062/ > /dev/null 2>&1 && sleep 0' 15621 1726882584.54463: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882584.54483: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882584.54499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882584.54568: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882584.54674: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882584.54678: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882584.54720: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882584.54815: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882584.56884: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882584.56888: stdout chunk (state=3): >>><<< 15621 1726882584.56891: stderr chunk (state=3): >>><<< 15621 1726882584.57075: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882584.57079: handler run complete 15621 1726882584.57082: attempt loop complete, returning result 15621 1726882584.57084: _execute() done 15621 1726882584.57086: dumping result to json 15621 1726882584.57088: done dumping result, returning 15621 1726882584.57207: done running TaskExecutor() for managed_node3/TASK: Get stat for interface lsr27 [0affc7ec-ae25-af1a-5b92-00000000021e] 15621 1726882584.57211: sending task result for task 0affc7ec-ae25-af1a-5b92-00000000021e 15621 1726882584.57483: done sending task result for task 0affc7ec-ae25-af1a-5b92-00000000021e 15621 1726882584.57486: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "atime": 1726882582.3799071, "block_size": 4096, "blocks": 0, "ctime": 1726882582.3799071, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 38370, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/lsr27", "lnk_target": "../../devices/virtual/net/lsr27", "mode": "0777", "mtime": 1726882582.3799071, "nlink": 1, "path": "/sys/class/net/lsr27", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 15621 1726882584.57727: no more pending results, returning what we have 15621 1726882584.57731: results queue empty 15621 1726882584.57732: checking for any_errors_fatal 15621 1726882584.57734: done checking for any_errors_fatal 15621 1726882584.57735: checking for max_fail_percentage 15621 1726882584.57736: done checking for max_fail_percentage 15621 1726882584.57738: checking to see if all hosts have failed and the running result is not ok 15621 1726882584.57739: done checking to see if all hosts have failed 15621 1726882584.57740: getting the remaining hosts for this loop 15621 1726882584.57741: done getting the remaining hosts for this loop 15621 1726882584.57746: getting the next task for host managed_node3 15621 1726882584.57755: done getting next task for host managed_node3 15621 1726882584.57758: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 15621 1726882584.57761: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882584.57765: getting variables 15621 1726882584.57767: in VariableManager get_vars() 15621 1726882584.57806: Calling all_inventory to load vars for managed_node3 15621 1726882584.57809: Calling groups_inventory to load vars for managed_node3 15621 1726882584.57812: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882584.58135: Calling all_plugins_play to load vars for managed_node3 15621 1726882584.58139: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882584.58150: Calling groups_plugins_play to load vars for managed_node3 15621 1726882584.58629: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882584.58987: done with get_vars() 15621 1726882584.58999: done getting variables 15621 1726882584.59216: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 15621 1726882584.59550: variable 'interface' from source: set_fact TASK [Assert that the interface is present - 'lsr27'] ************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:36:24 -0400 (0:00:00.382) 0:00:16.676 ****** 15621 1726882584.59700: entering _queue_task() for managed_node3/assert 15621 1726882584.59702: Creating lock for assert 15621 1726882584.60430: worker is 1 (out of 1 available) 15621 1726882584.60448: exiting _queue_task() for managed_node3/assert 15621 1726882584.60460: done queuing things up, now waiting for results queue to drain 15621 1726882584.60461: waiting for pending results... 
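
The assert just queued here (assert_device_present.yml:5) consumes the interface_stat result registered above; the only conditional the trace evaluates for it, a little further below, is interface_stat.stat.exists. A minimal sketch of an equivalent assertion, assuming this is roughly the shape of the real tasks file (the failure message is illustrative and not taken from the log):

  # Hypothetical reconstruction of the assertion (shape inferred from the evaluated conditional)
  - name: Assert that the interface is present - '{{ interface }}'
    ansible.builtin.assert:
      that:
        - interface_stat.stat.exists
      fail_msg: "Interface {{ interface }} does not exist on the managed host"   # illustrative only

Because /sys/class/net/lsr27 resolved to an existing symlink, the conditional evaluates to True and the task reports "All assertions passed" with changed: false, which is exactly what the result block below shows.
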
15621 1726882584.61143: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'lsr27' 15621 1726882584.61316: in run() - task 0affc7ec-ae25-af1a-5b92-0000000001d4 15621 1726882584.61320: variable 'ansible_search_path' from source: unknown 15621 1726882584.61372: variable 'ansible_search_path' from source: unknown 15621 1726882584.61377: calling self._execute() 15621 1726882584.61455: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882584.61461: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882584.61475: variable 'omit' from source: magic vars 15621 1726882584.62632: variable 'ansible_distribution_major_version' from source: facts 15621 1726882584.62636: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882584.62639: variable 'omit' from source: magic vars 15621 1726882584.62642: variable 'omit' from source: magic vars 15621 1726882584.62695: variable 'interface' from source: set_fact 15621 1726882584.62714: variable 'omit' from source: magic vars 15621 1726882584.62995: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882584.62999: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882584.63019: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882584.63040: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882584.63054: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882584.63088: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882584.63092: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882584.63095: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882584.63508: Set connection var ansible_connection to ssh 15621 1726882584.63511: Set connection var ansible_shell_executable to /bin/sh 15621 1726882584.63514: Set connection var ansible_timeout to 10 15621 1726882584.63516: Set connection var ansible_shell_type to sh 15621 1726882584.63519: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882584.63521: Set connection var ansible_pipelining to False 15621 1726882584.63526: variable 'ansible_shell_executable' from source: unknown 15621 1726882584.63528: variable 'ansible_connection' from source: unknown 15621 1726882584.63531: variable 'ansible_module_compression' from source: unknown 15621 1726882584.63534: variable 'ansible_shell_type' from source: unknown 15621 1726882584.63537: variable 'ansible_shell_executable' from source: unknown 15621 1726882584.63539: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882584.63541: variable 'ansible_pipelining' from source: unknown 15621 1726882584.63543: variable 'ansible_timeout' from source: unknown 15621 1726882584.63545: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882584.63881: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 15621 1726882584.63885: variable 'omit' from source: magic vars 15621 1726882584.63888: starting attempt loop 15621 1726882584.63891: running the handler 15621 1726882584.64176: variable 'interface_stat' from source: set_fact 15621 1726882584.64206: Evaluated conditional (interface_stat.stat.exists): True 15621 1726882584.64209: handler run complete 15621 1726882584.64212: attempt loop complete, returning result 15621 1726882584.64214: _execute() done 15621 1726882584.64217: dumping result to json 15621 1726882584.64430: done dumping result, returning 15621 1726882584.64434: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'lsr27' [0affc7ec-ae25-af1a-5b92-0000000001d4] 15621 1726882584.64437: sending task result for task 0affc7ec-ae25-af1a-5b92-0000000001d4 15621 1726882584.64663: done sending task result for task 0affc7ec-ae25-af1a-5b92-0000000001d4 15621 1726882584.64667: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 15621 1726882584.64784: no more pending results, returning what we have 15621 1726882584.64787: results queue empty 15621 1726882584.64789: checking for any_errors_fatal 15621 1726882584.64796: done checking for any_errors_fatal 15621 1726882584.64797: checking for max_fail_percentage 15621 1726882584.64798: done checking for max_fail_percentage 15621 1726882584.64799: checking to see if all hosts have failed and the running result is not ok 15621 1726882584.64800: done checking to see if all hosts have failed 15621 1726882584.64801: getting the remaining hosts for this loop 15621 1726882584.64802: done getting the remaining hosts for this loop 15621 1726882584.64807: getting the next task for host managed_node3 15621 1726882584.64815: done getting next task for host managed_node3 15621 1726882584.64817: ^ task is: TASK: meta (flush_handlers) 15621 1726882584.64818: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882584.64831: getting variables 15621 1726882584.64833: in VariableManager get_vars() 15621 1726882584.64863: Calling all_inventory to load vars for managed_node3 15621 1726882584.64866: Calling groups_inventory to load vars for managed_node3 15621 1726882584.64869: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882584.64884: Calling all_plugins_play to load vars for managed_node3 15621 1726882584.64887: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882584.64890: Calling groups_plugins_play to load vars for managed_node3 15621 1726882584.65804: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882584.66446: done with get_vars() 15621 1726882584.66457: done getting variables 15621 1726882584.66704: in VariableManager get_vars() 15621 1726882584.66715: Calling all_inventory to load vars for managed_node3 15621 1726882584.66940: Calling groups_inventory to load vars for managed_node3 15621 1726882584.66944: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882584.66950: Calling all_plugins_play to load vars for managed_node3 15621 1726882584.66952: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882584.66956: Calling groups_plugins_play to load vars for managed_node3 15621 1726882584.67389: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882584.67725: done with get_vars() 15621 1726882584.67737: done queuing things up, now waiting for results queue to drain 15621 1726882584.67739: results queue empty 15621 1726882584.67740: checking for any_errors_fatal 15621 1726882584.67743: done checking for any_errors_fatal 15621 1726882584.67743: checking for max_fail_percentage 15621 1726882584.67744: done checking for max_fail_percentage 15621 1726882584.67745: checking to see if all hosts have failed and the running result is not ok 15621 1726882584.67745: done checking to see if all hosts have failed 15621 1726882584.67750: getting the remaining hosts for this loop 15621 1726882584.67751: done getting the remaining hosts for this loop 15621 1726882584.67753: getting the next task for host managed_node3 15621 1726882584.67757: done getting next task for host managed_node3 15621 1726882584.67758: ^ task is: TASK: meta (flush_handlers) 15621 1726882584.67760: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882584.67762: getting variables 15621 1726882584.67763: in VariableManager get_vars() 15621 1726882584.67772: Calling all_inventory to load vars for managed_node3 15621 1726882584.67774: Calling groups_inventory to load vars for managed_node3 15621 1726882584.67776: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882584.67781: Calling all_plugins_play to load vars for managed_node3 15621 1726882584.67783: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882584.67787: Calling groups_plugins_play to load vars for managed_node3 15621 1726882584.67941: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882584.68324: done with get_vars() 15621 1726882584.68333: done getting variables 15621 1726882584.68382: in VariableManager get_vars() 15621 1726882584.68391: Calling all_inventory to load vars for managed_node3 15621 1726882584.68393: Calling groups_inventory to load vars for managed_node3 15621 1726882584.68396: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882584.68400: Calling all_plugins_play to load vars for managed_node3 15621 1726882584.68403: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882584.68406: Calling groups_plugins_play to load vars for managed_node3 15621 1726882584.68546: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882584.68938: done with get_vars() 15621 1726882584.68950: done queuing things up, now waiting for results queue to drain 15621 1726882584.68952: results queue empty 15621 1726882584.68952: checking for any_errors_fatal 15621 1726882584.68954: done checking for any_errors_fatal 15621 1726882584.68954: checking for max_fail_percentage 15621 1726882584.68955: done checking for max_fail_percentage 15621 1726882584.68956: checking to see if all hosts have failed and the running result is not ok 15621 1726882584.68957: done checking to see if all hosts have failed 15621 1726882584.68958: getting the remaining hosts for this loop 15621 1726882584.68959: done getting the remaining hosts for this loop 15621 1726882584.68961: getting the next task for host managed_node3 15621 1726882584.68964: done getting next task for host managed_node3 15621 1726882584.68965: ^ task is: None 15621 1726882584.68967: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882584.68968: done queuing things up, now waiting for results queue to drain 15621 1726882584.68969: results queue empty 15621 1726882584.68969: checking for any_errors_fatal 15621 1726882584.68970: done checking for any_errors_fatal 15621 1726882584.68971: checking for max_fail_percentage 15621 1726882584.68972: done checking for max_fail_percentage 15621 1726882584.68972: checking to see if all hosts have failed and the running result is not ok 15621 1726882584.68973: done checking to see if all hosts have failed 15621 1726882584.68974: getting the next task for host managed_node3 15621 1726882584.68977: done getting next task for host managed_node3 15621 1726882584.68977: ^ task is: None 15621 1726882584.68979: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15621 1726882584.69338: in VariableManager get_vars() 15621 1726882584.69360: done with get_vars() 15621 1726882584.69366: in VariableManager get_vars() 15621 1726882584.69379: done with get_vars() 15621 1726882584.69383: variable 'omit' from source: magic vars 15621 1726882584.69415: in VariableManager get_vars() 15621 1726882584.69634: done with get_vars() 15621 1726882584.69659: variable 'omit' from source: magic vars PLAY [Test static interface up] ************************************************ 15621 1726882584.70979: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15621 1726882584.71003: getting the remaining hosts for this loop 15621 1726882584.71005: done getting the remaining hosts for this loop 15621 1726882584.71008: getting the next task for host managed_node3 15621 1726882584.71011: done getting next task for host managed_node3 15621 1726882584.71013: ^ task is: TASK: Gathering Facts 15621 1726882584.71015: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882584.71017: getting variables 15621 1726882584.71018: in VariableManager get_vars() 15621 1726882584.71031: Calling all_inventory to load vars for managed_node3 15621 1726882584.71033: Calling groups_inventory to load vars for managed_node3 15621 1726882584.71036: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882584.71041: Calling all_plugins_play to load vars for managed_node3 15621 1726882584.71044: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882584.71047: Calling groups_plugins_play to load vars for managed_node3 15621 1726882584.71191: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882584.71389: done with get_vars() 15621 1726882584.71398: done getting variables 15621 1726882584.71445: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:33 Friday 20 September 2024 21:36:24 -0400 (0:00:00.117) 0:00:16.793 ****** 15621 1726882584.71470: entering _queue_task() for managed_node3/gather_facts 15621 1726882584.71952: worker is 1 (out of 1 available) 15621 1726882584.71962: exiting _queue_task() for managed_node3/gather_facts 15621 1726882584.71973: done queuing things up, now waiting for results queue to drain 15621 1726882584.71975: waiting for pending results... 
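
This "Gathering Facts" task (tests_ethernet.yml:33) is the implicit fact-gathering step of the new "Test static interface up" play: the gather_facts action plugin wraps the setup module, whose AnsiballZ payload is transferred and executed in the chunks that follow, and the resulting facts (for example ansible_distribution_major_version, used by the conditionals throughout this trace) are stored for the play's hosts. A minimal sketch of a play header that would produce this step; only the play name comes from the log, while the hosts pattern and task body are assumptions:

  # Sketch of a play that triggers this implicit setup run (hosts/body are assumptions)
  - name: Test static interface up
    hosts: all
    gather_facts: true          # the default; this is what emits TASK [Gathering Facts]
    tasks:
      - name: Example use of a gathered fact
        ansible.builtin.debug:
          msg: "Distribution major version is {{ ansible_distribution_major_version }}"

Setting gather_facts: false on the play (or gathering = explicit in ansible.cfg) would suppress this step and skip the setup module transfer that follows.
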
15621 1726882584.72063: running TaskExecutor() for managed_node3/TASK: Gathering Facts 15621 1726882584.72202: in run() - task 0affc7ec-ae25-af1a-5b92-000000000237 15621 1726882584.72207: variable 'ansible_search_path' from source: unknown 15621 1726882584.72229: calling self._execute() 15621 1726882584.72317: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882584.72419: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882584.72424: variable 'omit' from source: magic vars 15621 1726882584.72734: variable 'ansible_distribution_major_version' from source: facts 15621 1726882584.72756: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882584.72765: variable 'omit' from source: magic vars 15621 1726882584.72797: variable 'omit' from source: magic vars 15621 1726882584.72840: variable 'omit' from source: magic vars 15621 1726882584.72889: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882584.72933: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882584.72962: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882584.72991: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882584.73010: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882584.73047: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882584.73056: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882584.73063: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882584.73178: Set connection var ansible_connection to ssh 15621 1726882584.73227: Set connection var ansible_shell_executable to /bin/sh 15621 1726882584.73231: Set connection var ansible_timeout to 10 15621 1726882584.73233: Set connection var ansible_shell_type to sh 15621 1726882584.73235: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882584.73237: Set connection var ansible_pipelining to False 15621 1726882584.73262: variable 'ansible_shell_executable' from source: unknown 15621 1726882584.73270: variable 'ansible_connection' from source: unknown 15621 1726882584.73277: variable 'ansible_module_compression' from source: unknown 15621 1726882584.73283: variable 'ansible_shell_type' from source: unknown 15621 1726882584.73292: variable 'ansible_shell_executable' from source: unknown 15621 1726882584.73301: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882584.73309: variable 'ansible_pipelining' from source: unknown 15621 1726882584.73408: variable 'ansible_timeout' from source: unknown 15621 1726882584.73413: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882584.73570: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15621 1726882584.73589: variable 'omit' from source: magic vars 15621 1726882584.73601: starting attempt loop 15621 1726882584.73608: running the 
handler 15621 1726882584.73639: variable 'ansible_facts' from source: unknown 15621 1726882584.73664: _low_level_execute_command(): starting 15621 1726882584.73677: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15621 1726882584.74519: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882584.74566: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882584.74740: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882584.74844: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882584.76568: stdout chunk (state=3): >>>/root <<< 15621 1726882584.76741: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882584.76795: stderr chunk (state=3): >>><<< 15621 1726882584.76805: stdout chunk (state=3): >>><<< 15621 1726882584.76839: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882584.76877: _low_level_execute_command(): starting 15621 1726882584.76881: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882584.7684596-16250-178474325492821 `" && echo ansible-tmp-1726882584.7684596-16250-178474325492821="` echo /root/.ansible/tmp/ansible-tmp-1726882584.7684596-16250-178474325492821 `" ) && sleep 0' 
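
The command just queued is the standard per-task staging sequence this trace repeats for every module: detect the remote home directory (echo ~), create ~/.ansible/tmp and a per-task ansible-tmp-* directory under umask 77, sftp the AnsiballZ payload into it, chmod u+x, run it with the remote python3.12, then remove the directory with rm -f -r. The connection vars above show ansible_pipelining set to False, which is why each module pays for these extra round trips. A sketch of one common way to enable pipelining instead; this is an assumption about configuration, not something this run uses, and it requires that become on the target does not enforce requiretty:

  # Sketch: YAML inventory vars enabling pipelining (an assumption; this run has it off)
  all:
    vars:
      ansible_pipelining: true    # module code is piped to the remote python over the
                                  # multiplexed SSH session, skipping the temp-dir staging

With pipelining on, the existing ControlMaster socket (the '/root/.ansible/cp/9aa64530f0' mux session seen in the SSH debug output) carries the module source directly to the interpreter's stdin, so the mkdir/sftp/chmod/rm steps above disappear for most modules.
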
15621 1726882584.77589: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882584.77593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882584.77596: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882584.77660: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882584.77664: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882584.77697: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882584.77837: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882584.79809: stdout chunk (state=3): >>>ansible-tmp-1726882584.7684596-16250-178474325492821=/root/.ansible/tmp/ansible-tmp-1726882584.7684596-16250-178474325492821 <<< 15621 1726882584.80432: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882584.80437: stdout chunk (state=3): >>><<< 15621 1726882584.80439: stderr chunk (state=3): >>><<< 15621 1726882584.80443: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882584.7684596-16250-178474325492821=/root/.ansible/tmp/ansible-tmp-1726882584.7684596-16250-178474325492821 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882584.80445: variable 'ansible_module_compression' from source: unknown 15621 1726882584.80447: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15621x2kdpbzd/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15621 1726882584.80450: variable 
'ansible_facts' from source: unknown 15621 1726882584.80785: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882584.7684596-16250-178474325492821/AnsiballZ_setup.py 15621 1726882584.81119: Sending initial data 15621 1726882584.81125: Sent initial data (154 bytes) 15621 1726882584.81802: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882584.81812: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882584.81826: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882584.81840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882584.81852: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 15621 1726882584.81860: stderr chunk (state=3): >>>debug2: match not found <<< 15621 1726882584.81939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882584.81960: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882584.81974: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882584.81988: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882584.82112: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882584.83834: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15621 1726882584.83938: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15621 1726882584.84095: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmpss3r7cd1 /root/.ansible/tmp/ansible-tmp-1726882584.7684596-16250-178474325492821/AnsiballZ_setup.py <<< 15621 1726882584.84099: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882584.7684596-16250-178474325492821/AnsiballZ_setup.py" <<< 15621 1726882584.84162: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmpss3r7cd1" to remote "/root/.ansible/tmp/ansible-tmp-1726882584.7684596-16250-178474325492821/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882584.7684596-16250-178474325492821/AnsiballZ_setup.py" <<< 15621 1726882584.86104: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882584.86140: stderr chunk (state=3): >>><<< 15621 1726882584.86147: stdout chunk (state=3): >>><<< 15621 1726882584.86263: done transferring module to remote 15621 1726882584.86266: _low_level_execute_command(): starting 15621 1726882584.86269: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882584.7684596-16250-178474325492821/ /root/.ansible/tmp/ansible-tmp-1726882584.7684596-16250-178474325492821/AnsiballZ_setup.py && sleep 0' 15621 1726882584.86845: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882584.86858: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882584.86873: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882584.86894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882584.86910: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 15621 1726882584.86966: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882584.87020: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882584.87046: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882584.87071: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882584.87190: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882584.89048: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882584.89110: stderr chunk (state=3): >>><<< 15621 1726882584.89127: stdout chunk (state=3): >>><<< 15621 1726882584.89157: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882584.89245: _low_level_execute_command(): starting 15621 1726882584.89249: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882584.7684596-16250-178474325492821/AnsiballZ_setup.py && sleep 0' 15621 1726882584.89837: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882584.89862: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15621 1726882584.89921: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882584.89979: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882584.90031: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882584.90123: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882586.77383: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_lsb": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCqxUjCEqNblrTyW6Uf6mIYxca8N+p8oJNuOoXU65bGRNg3CMa5WjaOdqcLJoa5cqHU94Eb2GKTTyez0hcVUk7tsi3NxQudVrBDJQwbGPLKwHTfAOeffQrSKU6cQIc1wl+jLeNyQet7t+mRPHDLLjdsLuWud7KDSFY7tB05hqCIT7br7Ql/dFhcnCdWQFQMOHFOz3ScJe9gey/LD3ji7GRONjSr/t5cpKmB6mxzEmsb1n6YZdbP8HCphGcvKR4W+uaX3gVfQE0qvrqlobTyex8yIrkML2bRGO0cQ0YQWRYUwl+2NZufO8pixR1WlzvjooEQLCa77cJ6SZ8LyFkDOI+mMyuj8kcM9hS4AD91rPxl8C0d6Jg8RKqnImxC3X/NNaRYHqlewUGo6VKkcO4+lxgJGqYFcmkGEHzq4fuf6gtrr3rJkcIFcrluI0mSyZ2wXzI9K1OLHK0fnDvDUdV21RdTxfpz2ZFqykIWxdtugE4qaNMgbtV0VnufdkfZoCt9ayU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", 
"ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCVkskQ7Wf194qJaR5aLJzIbxDeKLsVL0wQFKV8r0F7GGZAGvI7/LHajoQ1NvRR35h4P+UpQQWPriVBtLfXYfXQ=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHICQbhAvKstSrwCX3R+nlPjOjLF0EHt/gL32n1ZS9Xl", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_iscsi_iqn": "", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_is_chroot": false, "ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "ip-10-31-45-226.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-226", "ansible_nodename": "ip-10-31-45-226.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ea14692d25e88f0b7167787b368d", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:b5954bb9-e972-4b2a-94f1-a82c77e96f77", "ansible_loadavg": {"1m": 0.796875, "5m": 0.65576171875, "15m": 0.326171875}, "ansible_fips": false, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.180 36814 10.31.45.226 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 36814 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_<<< 15621 1726882586.77390: stdout chunk (state=3): >>>64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, 
"ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_interfaces": ["lo", "eth0", "peerlsr27", "lsr27"], "ansible_lsr27": {"device": "lsr27", "macaddress": "b2:cc:5d:76:bc:9a", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::b0cc:5dff:fe76:bc9a", "prefix": "64", "scope": "link"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "02:19:da:ea:a3:f3", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.226", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::19:daff:feea:a3f3", "prefix": "64", "scope": "link"}]}, "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "8a:9f:5f:c3:90:d4", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::889f:5fff:fec3:90d4", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.226", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:19:da:ea:a3:f3", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.226"], "ansible_all_ipv6_addresses": ["fe80::b0cc:5dff:fe76:bc9a", "fe80::19:daff:feea:a3f3", "fe80::889f:5fff:fec3:90d4"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.226", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::19:daff:feea:a3f3", "fe80::889f:5fff:fec3:90d4", "fe80::b0cc:5dff:fe76:bc9a"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_pkg_mgr": "dnf", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", "second": "26", "epoch": "1726882586", "epoch_int": "1726882586", "date": "2024-09-20", "time": "21:36:26", "iso8601_micro": "2024-09-21T01:36:26.444080Z", "iso8601": "2024-09-21T01:36:26Z", "iso8601_basic": "20240920T213626444080", "iso8601_basic_short": "20240920T213626", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3049, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 667, "free": 3049}, "nocache": {"free": 3432, "used": 284}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", 
"ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ea14-692d-25e8-8f0b-7167787b368d", "ansible_product_uuid": "ec22ea14-692d-25e8-8f0b-7167787b368d", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 730, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251384340480, "block_size": 4096, "block_total": 64483404, "block_available": 61373130, "block_used": 3110274, "inode_total": 16384000, "inode_available": 16303144, "inode_used": 80856, "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"}], "ansible_fibre_channel_wwn": [], "ansible_local": {}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15621 1726882586.79486: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 
<<< 15621 1726882586.79554: stderr chunk (state=3): >>><<< 15621 1726882586.79557: stdout chunk (state=3): >>><<< 15621 1726882586.79580: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_lsb": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCqxUjCEqNblrTyW6Uf6mIYxca8N+p8oJNuOoXU65bGRNg3CMa5WjaOdqcLJoa5cqHU94Eb2GKTTyez0hcVUk7tsi3NxQudVrBDJQwbGPLKwHTfAOeffQrSKU6cQIc1wl+jLeNyQet7t+mRPHDLLjdsLuWud7KDSFY7tB05hqCIT7br7Ql/dFhcnCdWQFQMOHFOz3ScJe9gey/LD3ji7GRONjSr/t5cpKmB6mxzEmsb1n6YZdbP8HCphGcvKR4W+uaX3gVfQE0qvrqlobTyex8yIrkML2bRGO0cQ0YQWRYUwl+2NZufO8pixR1WlzvjooEQLCa77cJ6SZ8LyFkDOI+mMyuj8kcM9hS4AD91rPxl8C0d6Jg8RKqnImxC3X/NNaRYHqlewUGo6VKkcO4+lxgJGqYFcmkGEHzq4fuf6gtrr3rJkcIFcrluI0mSyZ2wXzI9K1OLHK0fnDvDUdV21RdTxfpz2ZFqykIWxdtugE4qaNMgbtV0VnufdkfZoCt9ayU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCVkskQ7Wf194qJaR5aLJzIbxDeKLsVL0wQFKV8r0F7GGZAGvI7/LHajoQ1NvRR35h4P+UpQQWPriVBtLfXYfXQ=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHICQbhAvKstSrwCX3R+nlPjOjLF0EHt/gL32n1ZS9Xl", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_iscsi_iqn": "", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_is_chroot": false, "ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "ip-10-31-45-226.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-226", "ansible_nodename": "ip-10-31-45-226.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ea14692d25e88f0b7167787b368d", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:b5954bb9-e972-4b2a-94f1-a82c77e96f77", "ansible_loadavg": {"1m": 0.796875, "5m": 0.65576171875, "15m": 0.326171875}, "ansible_fips": false, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.180 36814 10.31.45.226 22", "XDG_SESSION_CLASS": "user", 
"SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 36814 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_interfaces": ["lo", "eth0", "peerlsr27", "lsr27"], "ansible_lsr27": {"device": "lsr27", "macaddress": "b2:cc:5d:76:bc:9a", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::b0cc:5dff:fe76:bc9a", "prefix": "64", "scope": "link"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "02:19:da:ea:a3:f3", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.226", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::19:daff:feea:a3f3", "prefix": "64", "scope": "link"}]}, "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "8a:9f:5f:c3:90:d4", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::889f:5fff:fec3:90d4", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.226", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:19:da:ea:a3:f3", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.226"], "ansible_all_ipv6_addresses": ["fe80::b0cc:5dff:fe76:bc9a", "fe80::19:daff:feea:a3f3", "fe80::889f:5fff:fec3:90d4"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.226", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::19:daff:feea:a3f3", "fe80::889f:5fff:fec3:90d4", "fe80::b0cc:5dff:fe76:bc9a"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_pkg_mgr": "dnf", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", "second": "26", "epoch": "1726882586", "epoch_int": "1726882586", "date": "2024-09-20", "time": "21:36:26", "iso8601_micro": "2024-09-21T01:36:26.444080Z", 
"iso8601": "2024-09-21T01:36:26Z", "iso8601_basic": "20240920T213626444080", "iso8601_basic_short": "20240920T213626", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3049, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 667, "free": 3049}, "nocache": {"free": 3432, "used": 284}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ea14-692d-25e8-8f0b-7167787b368d", "ansible_product_uuid": "ec22ea14-692d-25e8-8f0b-7167787b368d", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 730, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251384340480, "block_size": 4096, "block_total": 64483404, "block_available": 61373130, "block_used": 3110274, "inode_total": 16384000, "inode_available": 16303144, "inode_used": 80856, "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"}], "ansible_fibre_channel_wwn": [], "ansible_local": {}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.6p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 15621 1726882586.79824: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882584.7684596-16250-178474325492821/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15621 1726882586.79844: _low_level_execute_command(): starting 15621 1726882586.79847: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882584.7684596-16250-178474325492821/ > /dev/null 2>&1 && sleep 0' 15621 1726882586.80396: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882586.80399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882586.80402: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address <<< 15621 1726882586.80404: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 15621 1726882586.80406: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882586.80465: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882586.80471: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882586.80479: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882586.80557: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882586.82495: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882586.82548: stderr chunk (state=3): >>><<< 15621 1726882586.82552: stdout chunk (state=3): >>><<< 15621 1726882586.82564: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882586.82575: handler run complete 15621 1726882586.82658: variable 'ansible_facts' from source: unknown 15621 1726882586.82731: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882586.82934: variable 'ansible_facts' from source: unknown 15621 1726882586.82988: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882586.83075: attempt loop complete, returning result 15621 1726882586.83079: _execute() done 15621 1726882586.83081: dumping result to json 15621 1726882586.83099: done dumping result, returning 15621 1726882586.83107: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [0affc7ec-ae25-af1a-5b92-000000000237] 15621 1726882586.83113: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000237 ok: [managed_node3] 15621 1726882586.83586: no more pending results, returning what we have 15621 1726882586.83594: results queue empty 15621 1726882586.83595: checking for any_errors_fatal 15621 1726882586.83596: done checking for any_errors_fatal 15621 1726882586.83596: checking for max_fail_percentage 15621 1726882586.83597: done checking for max_fail_percentage 15621 1726882586.83598: checking to see if all hosts have failed and the running result is not ok 15621 1726882586.83599: done checking to see if all hosts have failed 15621 1726882586.83599: getting the remaining hosts for this loop 15621 1726882586.83600: done getting the remaining hosts for this loop 15621 1726882586.83603: getting the next task for host managed_node3 15621 1726882586.83606: done getting next task for host managed_node3 15621 1726882586.83608: ^ task is: TASK: meta (flush_handlers) 15621 1726882586.83609: ^ state is: HOST STATE: block=1, 
task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15621 1726882586.83612: getting variables 15621 1726882586.83613: in VariableManager get_vars() 15621 1726882586.83638: Calling all_inventory to load vars for managed_node3 15621 1726882586.83640: Calling groups_inventory to load vars for managed_node3 15621 1726882586.83641: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882586.83651: Calling all_plugins_play to load vars for managed_node3 15621 1726882586.83653: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882586.83656: Calling groups_plugins_play to load vars for managed_node3 15621 1726882586.83774: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000237 15621 1726882586.83778: WORKER PROCESS EXITING 15621 1726882586.83789: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882586.83920: done with get_vars() 15621 1726882586.83931: done getting variables 15621 1726882586.83985: in VariableManager get_vars() 15621 1726882586.83993: Calling all_inventory to load vars for managed_node3 15621 1726882586.83995: Calling groups_inventory to load vars for managed_node3 15621 1726882586.83996: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882586.84000: Calling all_plugins_play to load vars for managed_node3 15621 1726882586.84001: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882586.84003: Calling groups_plugins_play to load vars for managed_node3 15621 1726882586.84098: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882586.84219: done with get_vars() 15621 1726882586.84231: done queuing things up, now waiting for results queue to drain 15621 1726882586.84233: results queue empty 15621 1726882586.84233: checking for any_errors_fatal 15621 1726882586.84235: done checking for any_errors_fatal 15621 1726882586.84239: checking for max_fail_percentage 15621 1726882586.84239: done checking for max_fail_percentage 15621 1726882586.84240: checking to see if all hosts have failed and the running result is not ok 15621 1726882586.84240: done checking to see if all hosts have failed 15621 1726882586.84241: getting the remaining hosts for this loop 15621 1726882586.84242: done getting the remaining hosts for this loop 15621 1726882586.84244: getting the next task for host managed_node3 15621 1726882586.84248: done getting next task for host managed_node3 15621 1726882586.84250: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15621 1726882586.84251: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882586.84260: getting variables 15621 1726882586.84261: in VariableManager get_vars() 15621 1726882586.84272: Calling all_inventory to load vars for managed_node3 15621 1726882586.84274: Calling groups_inventory to load vars for managed_node3 15621 1726882586.84275: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882586.84279: Calling all_plugins_play to load vars for managed_node3 15621 1726882586.84280: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882586.84282: Calling groups_plugins_play to load vars for managed_node3 15621 1726882586.84372: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882586.84665: done with get_vars() 15621 1726882586.84674: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:36:26 -0400 (0:00:02.132) 0:00:18.926 ****** 15621 1726882586.84731: entering _queue_task() for managed_node3/include_tasks 15621 1726882586.84946: worker is 1 (out of 1 available) 15621 1726882586.84959: exiting _queue_task() for managed_node3/include_tasks 15621 1726882586.84970: done queuing things up, now waiting for results queue to drain 15621 1726882586.84974: waiting for pending results... 15621 1726882586.85145: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15621 1726882586.85217: in run() - task 0affc7ec-ae25-af1a-5b92-000000000019 15621 1726882586.85230: variable 'ansible_search_path' from source: unknown 15621 1726882586.85233: variable 'ansible_search_path' from source: unknown 15621 1726882586.85264: calling self._execute() 15621 1726882586.85333: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882586.85336: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882586.85346: variable 'omit' from source: magic vars 15621 1726882586.85643: variable 'ansible_distribution_major_version' from source: facts 15621 1726882586.85648: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882586.85658: _execute() done 15621 1726882586.85661: dumping result to json 15621 1726882586.85664: done dumping result, returning 15621 1726882586.85673: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affc7ec-ae25-af1a-5b92-000000000019] 15621 1726882586.85676: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000019 15621 1726882586.85768: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000019 15621 1726882586.85774: WORKER PROCESS EXITING 15621 1726882586.85817: no more pending results, returning what we have 15621 1726882586.85821: in VariableManager get_vars() 15621 1726882586.85860: Calling all_inventory to load vars for managed_node3 15621 1726882586.85862: Calling groups_inventory to load vars for managed_node3 15621 1726882586.85865: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882586.85876: Calling all_plugins_play to load vars for managed_node3 15621 1726882586.85879: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882586.85882: Calling groups_plugins_play to load vars for managed_node3 15621 1726882586.86010: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882586.86149: done with get_vars() 15621 1726882586.86155: variable 'ansible_search_path' from source: unknown 15621 1726882586.86155: variable 'ansible_search_path' from source: unknown 15621 1726882586.86176: we have included files to process 15621 1726882586.86176: generating all_blocks data 15621 1726882586.86177: done generating all_blocks data 15621 1726882586.86178: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15621 1726882586.86178: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15621 1726882586.86180: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15621 1726882586.86699: done processing included file 15621 1726882586.86700: iterating over new_blocks loaded from include file 15621 1726882586.86701: in VariableManager get_vars() 15621 1726882586.86714: done with get_vars() 15621 1726882586.86715: filtering new block on tags 15621 1726882586.86728: done filtering new block on tags 15621 1726882586.86730: in VariableManager get_vars() 15621 1726882586.86742: done with get_vars() 15621 1726882586.86743: filtering new block on tags 15621 1726882586.86755: done filtering new block on tags 15621 1726882586.86756: in VariableManager get_vars() 15621 1726882586.86767: done with get_vars() 15621 1726882586.86768: filtering new block on tags 15621 1726882586.86779: done filtering new block on tags 15621 1726882586.86780: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 15621 1726882586.86785: extending task lists for all hosts with included blocks 15621 1726882586.87018: done extending task lists 15621 1726882586.87019: done processing included files 15621 1726882586.87020: results queue empty 15621 1726882586.87020: checking for any_errors_fatal 15621 1726882586.87023: done checking for any_errors_fatal 15621 1726882586.87023: checking for max_fail_percentage 15621 1726882586.87024: done checking for max_fail_percentage 15621 1726882586.87025: checking to see if all hosts have failed and the running result is not ok 15621 1726882586.87025: done checking to see if all hosts have failed 15621 1726882586.87026: getting the remaining hosts for this loop 15621 1726882586.87027: done getting the remaining hosts for this loop 15621 1726882586.87028: getting the next task for host managed_node3 15621 1726882586.87031: done getting next task for host managed_node3 15621 1726882586.87033: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15621 1726882586.87034: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882586.87041: getting variables 15621 1726882586.87041: in VariableManager get_vars() 15621 1726882586.87051: Calling all_inventory to load vars for managed_node3 15621 1726882586.87052: Calling groups_inventory to load vars for managed_node3 15621 1726882586.87053: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882586.87057: Calling all_plugins_play to load vars for managed_node3 15621 1726882586.87059: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882586.87060: Calling groups_plugins_play to load vars for managed_node3 15621 1726882586.87168: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882586.87297: done with get_vars() 15621 1726882586.87303: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:36:26 -0400 (0:00:00.026) 0:00:18.952 ****** 15621 1726882586.87357: entering _queue_task() for managed_node3/setup 15621 1726882586.87577: worker is 1 (out of 1 available) 15621 1726882586.87591: exiting _queue_task() for managed_node3/setup 15621 1726882586.87602: done queuing things up, now waiting for results queue to drain 15621 1726882586.87604: waiting for pending results... 15621 1726882586.87783: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15621 1726882586.87863: in run() - task 0affc7ec-ae25-af1a-5b92-000000000279 15621 1726882586.87877: variable 'ansible_search_path' from source: unknown 15621 1726882586.87881: variable 'ansible_search_path' from source: unknown 15621 1726882586.87914: calling self._execute() 15621 1726882586.87984: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882586.87989: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882586.87998: variable 'omit' from source: magic vars 15621 1726882586.88288: variable 'ansible_distribution_major_version' from source: facts 15621 1726882586.88298: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882586.88456: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15621 1726882586.90050: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15621 1726882586.90105: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15621 1726882586.90137: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15621 1726882586.90165: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15621 1726882586.90189: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15621 1726882586.90257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882586.90281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 15621 1726882586.90300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882586.90331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882586.90347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882586.90390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882586.90407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882586.90426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882586.90459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882586.90471: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882586.90589: variable '__network_required_facts' from source: role '' defaults 15621 1726882586.90596: variable 'ansible_facts' from source: unknown 15621 1726882586.90669: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 15621 1726882586.90673: when evaluation is False, skipping this task 15621 1726882586.90676: _execute() done 15621 1726882586.90680: dumping result to json 15621 1726882586.90683: done dumping result, returning 15621 1726882586.90689: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affc7ec-ae25-af1a-5b92-000000000279] 15621 1726882586.90694: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000279 15621 1726882586.90785: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000279 15621 1726882586.90789: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15621 1726882586.90837: no more pending results, returning what we have 15621 1726882586.90840: results queue empty 15621 1726882586.90842: checking for any_errors_fatal 15621 1726882586.90843: done checking for any_errors_fatal 15621 1726882586.90844: checking for max_fail_percentage 15621 1726882586.90845: done checking for max_fail_percentage 15621 1726882586.90846: checking to see if all hosts have failed and the running result is not ok 15621 1726882586.90847: done checking to see if all hosts have failed 15621 1726882586.90847: getting the remaining hosts for this loop 15621 1726882586.90849: done getting the remaining hosts for 
this loop 15621 1726882586.90853: getting the next task for host managed_node3 15621 1726882586.90861: done getting next task for host managed_node3 15621 1726882586.90865: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 15621 1726882586.90867: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15621 1726882586.90882: getting variables 15621 1726882586.90886: in VariableManager get_vars() 15621 1726882586.90926: Calling all_inventory to load vars for managed_node3 15621 1726882586.90929: Calling groups_inventory to load vars for managed_node3 15621 1726882586.90931: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882586.90941: Calling all_plugins_play to load vars for managed_node3 15621 1726882586.90944: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882586.90946: Calling groups_plugins_play to load vars for managed_node3 15621 1726882586.91101: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882586.91247: done with get_vars() 15621 1726882586.91257: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:36:26 -0400 (0:00:00.039) 0:00:18.992 ****** 15621 1726882586.91330: entering _queue_task() for managed_node3/stat 15621 1726882586.91559: worker is 1 (out of 1 available) 15621 1726882586.91572: exiting _queue_task() for managed_node3/stat 15621 1726882586.91584: done queuing things up, now waiting for results queue to drain 15621 1726882586.91586: waiting for pending results... 
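The skip recorded above is driven by the guard the trace prints verbatim: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0. In other words, the role only re-runs fact gathering when a fact it depends on is missing from what the play has already collected; since the preceding Gathering Facts task returned the full fact set, the difference is empty, the condition evaluates False, and the task is skipped (its result is censored because no_log: true was set for it). A minimal sketch of such a guarded setup task follows, reconstructed only from the task name, module action, and conditional visible in this trace; the gather_subset value is an assumption, not taken from the role itself.

    # Sketch only: the task name, action, and 'when' expression are copied from
    # the -vvv trace above; the module arguments are assumed for illustration.
    - name: Ensure ansible_facts used by role are present
      ansible.builtin.setup:
        gather_subset: min   # assumed; the trace does not show this task's args
      when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0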
15621 1726882586.91758: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 15621 1726882586.91849: in run() - task 0affc7ec-ae25-af1a-5b92-00000000027b 15621 1726882586.91862: variable 'ansible_search_path' from source: unknown 15621 1726882586.91865: variable 'ansible_search_path' from source: unknown 15621 1726882586.91898: calling self._execute() 15621 1726882586.91966: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882586.91969: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882586.91981: variable 'omit' from source: magic vars 15621 1726882586.92264: variable 'ansible_distribution_major_version' from source: facts 15621 1726882586.92277: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882586.92459: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15621 1726882586.92657: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15621 1726882586.92694: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15621 1726882586.92721: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15621 1726882586.92747: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15621 1726882586.92823: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15621 1726882586.92842: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15621 1726882586.92861: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882586.92883: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15621 1726882586.92952: variable '__network_is_ostree' from source: set_fact 15621 1726882586.92958: Evaluated conditional (not __network_is_ostree is defined): False 15621 1726882586.92961: when evaluation is False, skipping this task 15621 1726882586.92964: _execute() done 15621 1726882586.92967: dumping result to json 15621 1726882586.92973: done dumping result, returning 15621 1726882586.92982: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affc7ec-ae25-af1a-5b92-00000000027b] 15621 1726882586.92987: sending task result for task 0affc7ec-ae25-af1a-5b92-00000000027b 15621 1726882586.93072: done sending task result for task 0affc7ec-ae25-af1a-5b92-00000000027b 15621 1726882586.93075: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 15621 1726882586.93129: no more pending results, returning what we have 15621 1726882586.93132: results queue empty 15621 1726882586.93133: checking for any_errors_fatal 15621 1726882586.93138: done checking for any_errors_fatal 15621 1726882586.93139: checking for 
max_fail_percentage 15621 1726882586.93141: done checking for max_fail_percentage 15621 1726882586.93141: checking to see if all hosts have failed and the running result is not ok 15621 1726882586.93142: done checking to see if all hosts have failed 15621 1726882586.93143: getting the remaining hosts for this loop 15621 1726882586.93144: done getting the remaining hosts for this loop 15621 1726882586.93148: getting the next task for host managed_node3 15621 1726882586.93153: done getting next task for host managed_node3 15621 1726882586.93157: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 15621 1726882586.93160: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15621 1726882586.93173: getting variables 15621 1726882586.93174: in VariableManager get_vars() 15621 1726882586.93207: Calling all_inventory to load vars for managed_node3 15621 1726882586.93210: Calling groups_inventory to load vars for managed_node3 15621 1726882586.93212: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882586.93221: Calling all_plugins_play to load vars for managed_node3 15621 1726882586.93226: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882586.93230: Calling groups_plugins_play to load vars for managed_node3 15621 1726882586.93396: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882586.93528: done with get_vars() 15621 1726882586.93536: done getting variables 15621 1726882586.93580: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:36:26 -0400 (0:00:00.022) 0:00:19.015 ****** 15621 1726882586.93607: entering _queue_task() for managed_node3/set_fact 15621 1726882586.93813: worker is 1 (out of 1 available) 15621 1726882586.93828: exiting _queue_task() for managed_node3/set_fact 15621 1726882586.93840: done queuing things up, now waiting for results queue to drain 15621 1726882586.93842: waiting for pending results... 
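The follow-up task queued above ("Set flag to indicate system is ostree", set_facts.yml:17) is a set_fact call behind the same guard, which is why it also skips once __network_is_ostree has already been cached by an earlier include of set_facts.yml. A sketch of that shape, with the fact's value expression assumed for illustration rather than taken from the role:

    # Sketch only: module (set_fact) and guard are taken from this trace;
    # the value expression and source register are assumptions.
    - name: Set flag to indicate system is ostree
      ansible.builtin.set_fact:
        __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"   # assumed source register
      when: not __network_is_ostree is defined
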
15621 1726882586.94010: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 15621 1726882586.94094: in run() - task 0affc7ec-ae25-af1a-5b92-00000000027c 15621 1726882586.94107: variable 'ansible_search_path' from source: unknown 15621 1726882586.94110: variable 'ansible_search_path' from source: unknown 15621 1726882586.94144: calling self._execute() 15621 1726882586.94210: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882586.94216: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882586.94225: variable 'omit' from source: magic vars 15621 1726882586.94510: variable 'ansible_distribution_major_version' from source: facts 15621 1726882586.94521: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882586.94646: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15621 1726882586.94851: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15621 1726882586.94883: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15621 1726882586.94909: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15621 1726882586.94937: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15621 1726882586.95030: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15621 1726882586.95052: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15621 1726882586.95078: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882586.95095: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15621 1726882586.95162: variable '__network_is_ostree' from source: set_fact 15621 1726882586.95177: Evaluated conditional (not __network_is_ostree is defined): False 15621 1726882586.95182: when evaluation is False, skipping this task 15621 1726882586.95184: _execute() done 15621 1726882586.95187: dumping result to json 15621 1726882586.95190: done dumping result, returning 15621 1726882586.95193: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affc7ec-ae25-af1a-5b92-00000000027c] 15621 1726882586.95195: sending task result for task 0affc7ec-ae25-af1a-5b92-00000000027c 15621 1726882586.95282: done sending task result for task 0affc7ec-ae25-af1a-5b92-00000000027c 15621 1726882586.95285: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 15621 1726882586.95337: no more pending results, returning what we have 15621 1726882586.95340: results queue empty 15621 1726882586.95342: checking for any_errors_fatal 15621 1726882586.95347: done checking for any_errors_fatal 15621 
1726882586.95348: checking for max_fail_percentage 15621 1726882586.95350: done checking for max_fail_percentage 15621 1726882586.95350: checking to see if all hosts have failed and the running result is not ok 15621 1726882586.95352: done checking to see if all hosts have failed 15621 1726882586.95352: getting the remaining hosts for this loop 15621 1726882586.95354: done getting the remaining hosts for this loop 15621 1726882586.95358: getting the next task for host managed_node3 15621 1726882586.95365: done getting next task for host managed_node3 15621 1726882586.95369: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 15621 1726882586.95374: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15621 1726882586.95387: getting variables 15621 1726882586.95388: in VariableManager get_vars() 15621 1726882586.95420: Calling all_inventory to load vars for managed_node3 15621 1726882586.95432: Calling groups_inventory to load vars for managed_node3 15621 1726882586.95435: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882586.95444: Calling all_plugins_play to load vars for managed_node3 15621 1726882586.95447: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882586.95449: Calling groups_plugins_play to load vars for managed_node3 15621 1726882586.95579: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882586.95716: done with get_vars() 15621 1726882586.95726: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:36:26 -0400 (0:00:00.021) 0:00:19.037 ****** 15621 1726882586.95797: entering _queue_task() for managed_node3/service_facts 15621 1726882586.95799: Creating lock for service_facts 15621 1726882586.96027: worker is 1 (out of 1 available) 15621 1726882586.96042: exiting _queue_task() for managed_node3/service_facts 15621 1726882586.96056: done queuing things up, now waiting for results queue to drain 15621 1726882586.96058: waiting for pending results... 
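The next task queued above ("Check which services are running", set_facts.yml:21) invokes the service_facts module; the large JSON payloads later in this trace are the resulting ansible_facts.services dictionary being streamed back over the multiplexed SSH connection. A sketch of the task plus an illustrative consumer of the gathered facts (the debug task is an assumption, not part of the role):

    # Sketch only: service_facts and the shape of ansible_facts.services
    # (a dict keyed by unit name with state/status/source) match this trace;
    # the debug task below is purely illustrative.
    - name: Check which services are running
      ansible.builtin.service_facts:

    - name: Report NetworkManager state (illustrative only)
      ansible.builtin.debug:
        msg: "NetworkManager is {{ ansible_facts.services['NetworkManager.service'].state }}"
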
15621 1726882586.96220: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 15621 1726882586.96300: in run() - task 0affc7ec-ae25-af1a-5b92-00000000027e 15621 1726882586.96317: variable 'ansible_search_path' from source: unknown 15621 1726882586.96321: variable 'ansible_search_path' from source: unknown 15621 1726882586.96354: calling self._execute() 15621 1726882586.96427: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882586.96434: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882586.96443: variable 'omit' from source: magic vars 15621 1726882586.96789: variable 'ansible_distribution_major_version' from source: facts 15621 1726882586.96798: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882586.96804: variable 'omit' from source: magic vars 15621 1726882586.96847: variable 'omit' from source: magic vars 15621 1726882586.96873: variable 'omit' from source: magic vars 15621 1726882586.96909: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882586.96953: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882586.96959: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882586.96976: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882586.96987: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882586.97012: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882586.97015: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882586.97017: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882586.97099: Set connection var ansible_connection to ssh 15621 1726882586.97106: Set connection var ansible_shell_executable to /bin/sh 15621 1726882586.97112: Set connection var ansible_timeout to 10 15621 1726882586.97115: Set connection var ansible_shell_type to sh 15621 1726882586.97120: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882586.97127: Set connection var ansible_pipelining to False 15621 1726882586.97146: variable 'ansible_shell_executable' from source: unknown 15621 1726882586.97151: variable 'ansible_connection' from source: unknown 15621 1726882586.97154: variable 'ansible_module_compression' from source: unknown 15621 1726882586.97157: variable 'ansible_shell_type' from source: unknown 15621 1726882586.97159: variable 'ansible_shell_executable' from source: unknown 15621 1726882586.97163: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882586.97165: variable 'ansible_pipelining' from source: unknown 15621 1726882586.97168: variable 'ansible_timeout' from source: unknown 15621 1726882586.97170: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882586.97329: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15621 1726882586.97338: variable 'omit' from source: magic vars 15621 
1726882586.97342: starting attempt loop 15621 1726882586.97345: running the handler 15621 1726882586.97357: _low_level_execute_command(): starting 15621 1726882586.97364: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15621 1726882586.97914: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882586.97918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882586.97921: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882586.97925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882586.97980: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882586.97984: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882586.98079: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882586.99849: stdout chunk (state=3): >>>/root <<< 15621 1726882586.99958: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882587.00020: stderr chunk (state=3): >>><<< 15621 1726882587.00025: stdout chunk (state=3): >>><<< 15621 1726882587.00048: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882587.00061: _low_level_execute_command(): starting 15621 1726882587.00067: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726882587.0004742-16316-228740316422710 `" && echo ansible-tmp-1726882587.0004742-16316-228740316422710="` echo /root/.ansible/tmp/ansible-tmp-1726882587.0004742-16316-228740316422710 `" ) && sleep 0' 15621 1726882587.00576: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882587.00580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882587.00583: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 15621 1726882587.00595: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882587.00642: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882587.00645: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882587.00652: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882587.00737: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882587.02713: stdout chunk (state=3): >>>ansible-tmp-1726882587.0004742-16316-228740316422710=/root/.ansible/tmp/ansible-tmp-1726882587.0004742-16316-228740316422710 <<< 15621 1726882587.02829: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882587.02887: stderr chunk (state=3): >>><<< 15621 1726882587.02890: stdout chunk (state=3): >>><<< 15621 1726882587.02904: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882587.0004742-16316-228740316422710=/root/.ansible/tmp/ansible-tmp-1726882587.0004742-16316-228740316422710 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status 
from master 0 15621 1726882587.02950: variable 'ansible_module_compression' from source: unknown 15621 1726882587.02991: ANSIBALLZ: Using lock for service_facts 15621 1726882587.02995: ANSIBALLZ: Acquiring lock 15621 1726882587.02997: ANSIBALLZ: Lock acquired: 140146888391616 15621 1726882587.03000: ANSIBALLZ: Creating module 15621 1726882587.12461: ANSIBALLZ: Writing module into payload 15621 1726882587.12535: ANSIBALLZ: Writing module 15621 1726882587.12555: ANSIBALLZ: Renaming module 15621 1726882587.12562: ANSIBALLZ: Done creating module 15621 1726882587.12581: variable 'ansible_facts' from source: unknown 15621 1726882587.12629: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882587.0004742-16316-228740316422710/AnsiballZ_service_facts.py 15621 1726882587.12742: Sending initial data 15621 1726882587.12745: Sent initial data (162 bytes) 15621 1726882587.13257: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882587.13261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882587.13263: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration <<< 15621 1726882587.13265: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882587.13268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882587.13338: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882587.13341: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882587.13343: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882587.13420: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882587.15014: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15621 1726882587.15095: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15621 1726882587.15181: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmp754v8lib /root/.ansible/tmp/ansible-tmp-1726882587.0004742-16316-228740316422710/AnsiballZ_service_facts.py <<< 15621 1726882587.15184: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882587.0004742-16316-228740316422710/AnsiballZ_service_facts.py" <<< 15621 1726882587.15265: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmp754v8lib" to remote "/root/.ansible/tmp/ansible-tmp-1726882587.0004742-16316-228740316422710/AnsiballZ_service_facts.py" <<< 15621 1726882587.15274: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882587.0004742-16316-228740316422710/AnsiballZ_service_facts.py" <<< 15621 1726882587.15994: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882587.16072: stderr chunk (state=3): >>><<< 15621 1726882587.16078: stdout chunk (state=3): >>><<< 15621 1726882587.16097: done transferring module to remote 15621 1726882587.16109: _low_level_execute_command(): starting 15621 1726882587.16112: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882587.0004742-16316-228740316422710/ /root/.ansible/tmp/ansible-tmp-1726882587.0004742-16316-228740316422710/AnsiballZ_service_facts.py && sleep 0' 15621 1726882587.16602: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882587.16605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882587.16608: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882587.16615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882587.16659: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882587.16665: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882587.16752: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882587.18551: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882587.18601: stderr chunk (state=3): >>><<< 15621 1726882587.18605: stdout chunk (state=3): >>><<< 15621 1726882587.18620: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882587.18625: _low_level_execute_command(): starting 15621 1726882587.18630: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882587.0004742-16316-228740316422710/AnsiballZ_service_facts.py && sleep 0' 15621 1726882587.19105: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882587.19109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 15621 1726882587.19111: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882587.19113: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882587.19116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882587.19172: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882587.19175: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882587.19260: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882589.32835: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", 
"source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": 
{"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, 
"systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-<<< 15621 1726882589.32849: stdout chunk (state=3): >>>utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": 
"unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 15621 1726882589.34418: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 
10.31.45.226 closed. <<< 15621 1726882589.34508: stderr chunk (state=3): >>><<< 15621 1726882589.34512: stdout chunk (state=3): >>><<< 15621 1726882589.34537: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": 
"initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": 
"plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": 
"user@0.service", "state": "running", "status": "active", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": 
"firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": 
"plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 
15621 1726882589.36528: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882587.0004742-16316-228740316422710/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15621 1726882589.36532: _low_level_execute_command(): starting 15621 1726882589.36535: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882587.0004742-16316-228740316422710/ > /dev/null 2>&1 && sleep 0' 15621 1726882589.37314: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882589.37339: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882589.37571: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882589.37700: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882589.37726: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882589.37838: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882589.39739: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882589.40144: stderr chunk (state=3): >>><<< 15621 1726882589.40147: stdout chunk (state=3): >>><<< 15621 1726882589.40164: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882589.40172: handler run complete 15621 1726882589.40837: variable 'ansible_facts' from source: unknown 15621 1726882589.44832: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882589.45986: variable 'ansible_facts' from source: unknown 15621 1726882589.46328: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882589.46849: attempt loop complete, returning result 15621 1726882589.47128: _execute() done 15621 1726882589.47132: dumping result to json 15621 1726882589.47134: done dumping result, returning 15621 1726882589.47137: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0affc7ec-ae25-af1a-5b92-00000000027e] 15621 1726882589.47146: sending task result for task 0affc7ec-ae25-af1a-5b92-00000000027e 15621 1726882589.49063: done sending task result for task 0affc7ec-ae25-af1a-5b92-00000000027e 15621 1726882589.49068: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15621 1726882589.49128: no more pending results, returning what we have 15621 1726882589.49131: results queue empty 15621 1726882589.49133: checking for any_errors_fatal 15621 1726882589.49137: done checking for any_errors_fatal 15621 1726882589.49138: checking for max_fail_percentage 15621 1726882589.49139: done checking for max_fail_percentage 15621 1726882589.49140: checking to see if all hosts have failed and the running result is not ok 15621 1726882589.49141: done checking to see if all hosts have failed 15621 1726882589.49142: getting the remaining hosts for this loop 15621 1726882589.49143: done getting the remaining hosts for this loop 15621 1726882589.49147: getting the next task for host managed_node3 15621 1726882589.49152: done getting next task for host managed_node3 15621 1726882589.49155: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 15621 1726882589.49157: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882589.49165: getting variables 15621 1726882589.49167: in VariableManager get_vars() 15621 1726882589.49198: Calling all_inventory to load vars for managed_node3 15621 1726882589.49201: Calling groups_inventory to load vars for managed_node3 15621 1726882589.49203: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882589.49213: Calling all_plugins_play to load vars for managed_node3 15621 1726882589.49216: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882589.49330: Calling groups_plugins_play to load vars for managed_node3 15621 1726882589.50461: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882589.52308: done with get_vars() 15621 1726882589.52651: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:36:29 -0400 (0:00:02.571) 0:00:21.608 ****** 15621 1726882589.52904: entering _queue_task() for managed_node3/package_facts 15621 1726882589.52905: Creating lock for package_facts 15621 1726882589.53688: worker is 1 (out of 1 available) 15621 1726882589.53702: exiting _queue_task() for managed_node3/package_facts 15621 1726882589.53714: done queuing things up, now waiting for results queue to drain 15621 1726882589.53716: waiting for pending results... 15621 1726882589.54539: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 15621 1726882589.54732: in run() - task 0affc7ec-ae25-af1a-5b92-00000000027f 15621 1726882589.54736: variable 'ansible_search_path' from source: unknown 15621 1726882589.54739: variable 'ansible_search_path' from source: unknown 15621 1726882589.54743: calling self._execute() 15621 1726882589.54746: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882589.54749: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882589.54751: variable 'omit' from source: magic vars 15621 1726882589.55511: variable 'ansible_distribution_major_version' from source: facts 15621 1726882589.55744: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882589.55756: variable 'omit' from source: magic vars 15621 1726882589.55826: variable 'omit' from source: magic vars 15621 1726882589.55874: variable 'omit' from source: magic vars 15621 1726882589.56327: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882589.56331: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882589.56334: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882589.56336: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882589.56341: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882589.56344: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882589.56348: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882589.56350: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882589.56785: Set 
connection var ansible_connection to ssh 15621 1726882589.57128: Set connection var ansible_shell_executable to /bin/sh 15621 1726882589.57131: Set connection var ansible_timeout to 10 15621 1726882589.57133: Set connection var ansible_shell_type to sh 15621 1726882589.57136: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882589.57138: Set connection var ansible_pipelining to False 15621 1726882589.57140: variable 'ansible_shell_executable' from source: unknown 15621 1726882589.57142: variable 'ansible_connection' from source: unknown 15621 1726882589.57145: variable 'ansible_module_compression' from source: unknown 15621 1726882589.57146: variable 'ansible_shell_type' from source: unknown 15621 1726882589.57148: variable 'ansible_shell_executable' from source: unknown 15621 1726882589.57150: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882589.57152: variable 'ansible_pipelining' from source: unknown 15621 1726882589.57155: variable 'ansible_timeout' from source: unknown 15621 1726882589.57158: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882589.57536: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15621 1726882589.57553: variable 'omit' from source: magic vars 15621 1726882589.57563: starting attempt loop 15621 1726882589.57573: running the handler 15621 1726882589.57594: _low_level_execute_command(): starting 15621 1726882589.57607: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15621 1726882589.59038: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882589.59245: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882589.59420: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882589.59506: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882589.61209: stdout chunk (state=3): >>>/root <<< 15621 1726882589.61385: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882589.61401: stdout chunk (state=3): >>><<< 15621 1726882589.61416: stderr chunk (state=3): >>><<< 15621 1726882589.61443: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882589.61466: _low_level_execute_command(): starting 15621 1726882589.61484: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882589.6145034-16393-17488716523219 `" && echo ansible-tmp-1726882589.6145034-16393-17488716523219="` echo /root/.ansible/tmp/ansible-tmp-1726882589.6145034-16393-17488716523219 `" ) && sleep 0' 15621 1726882589.62154: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882589.62242: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882589.62278: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882589.62299: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882589.62339: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882589.62418: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882589.64388: stdout chunk (state=3): >>>ansible-tmp-1726882589.6145034-16393-17488716523219=/root/.ansible/tmp/ansible-tmp-1726882589.6145034-16393-17488716523219 <<< 15621 1726882589.64570: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882589.64604: stderr chunk (state=3): >>><<< 15621 1726882589.64608: stdout chunk (state=3): >>><<< 15621 1726882589.64829: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882589.6145034-16393-17488716523219=/root/.ansible/tmp/ansible-tmp-1726882589.6145034-16393-17488716523219 , 
stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882589.64833: variable 'ansible_module_compression' from source: unknown 15621 1726882589.64835: ANSIBALLZ: Using lock for package_facts 15621 1726882589.64838: ANSIBALLZ: Acquiring lock 15621 1726882589.64841: ANSIBALLZ: Lock acquired: 140146883419920 15621 1726882589.64843: ANSIBALLZ: Creating module 15621 1726882590.27866: ANSIBALLZ: Writing module into payload 15621 1726882590.28029: ANSIBALLZ: Writing module 15621 1726882590.28262: ANSIBALLZ: Renaming module 15621 1726882590.28269: ANSIBALLZ: Done creating module 15621 1726882590.28313: variable 'ansible_facts' from source: unknown 15621 1726882590.28749: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882589.6145034-16393-17488716523219/AnsiballZ_package_facts.py 15621 1726882590.28810: Sending initial data 15621 1726882590.28814: Sent initial data (161 bytes) 15621 1726882590.29557: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882590.29561: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882590.29564: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882590.29567: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882590.29569: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 15621 1726882590.29590: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882590.29653: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882590.29762: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 
1726882590.29932: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882590.31619: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15621 1726882590.31746: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15621 1726882590.31841: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmp7psompxl /root/.ansible/tmp/ansible-tmp-1726882589.6145034-16393-17488716523219/AnsiballZ_package_facts.py <<< 15621 1726882590.31849: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882589.6145034-16393-17488716523219/AnsiballZ_package_facts.py" <<< 15621 1726882590.31980: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmp7psompxl" to remote "/root/.ansible/tmp/ansible-tmp-1726882589.6145034-16393-17488716523219/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882589.6145034-16393-17488716523219/AnsiballZ_package_facts.py" <<< 15621 1726882590.35567: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882590.35574: stdout chunk (state=3): >>><<< 15621 1726882590.35577: stderr chunk (state=3): >>><<< 15621 1726882590.35579: done transferring module to remote 15621 1726882590.35582: _low_level_execute_command(): starting 15621 1726882590.35584: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882589.6145034-16393-17488716523219/ /root/.ansible/tmp/ansible-tmp-1726882589.6145034-16393-17488716523219/AnsiballZ_package_facts.py && sleep 0' 15621 1726882590.36091: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882590.36107: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882590.36111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882590.36341: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882590.36344: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 15621 1726882590.36346: stderr chunk (state=3): >>>debug2: match not found <<< 15621 1726882590.36348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882590.36350: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15621 1726882590.36352: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.226 is address <<< 15621 1726882590.36354: stderr chunk (state=3): >>>debug1: re-parsing configuration 
<<< 15621 1726882590.36360: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882590.36362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882590.36364: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882590.36366: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882590.36403: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882590.38348: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882590.38628: stderr chunk (state=3): >>><<< 15621 1726882590.38632: stdout chunk (state=3): >>><<< 15621 1726882590.38635: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882590.38637: _low_level_execute_command(): starting 15621 1726882590.38640: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882589.6145034-16393-17488716523219/AnsiballZ_package_facts.py && sleep 0' 15621 1726882590.39812: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882590.39824: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882590.39839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882590.39916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882590.39930: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 15621 1726882590.40018: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882590.40139: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882590.40148: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882590.40284: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882591.02292: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", 
"version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, <<< 15621 1726882591.02303: stdout chunk (state=3): >>>"arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": 
"1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", 
"version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "<<< 15621 1726882591.02501: stdout chunk (state=3): >>>systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", 
"release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", 
"release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "<<< 15621 1726882591.02518: stdout chunk (state=3): >>>libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": 
[{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", 
"version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": 
"21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": 
"firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "<<< 15621 1726882591.02636: stdout chunk (state=3): >>>arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", 
"release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": 
"restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": 
"python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 15621 1726882591.04496: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 
<<< 15621 1726882591.04500: stdout chunk (state=3): >>><<< 15621 1726882591.04509: stderr chunk (state=3): >>><<< 15621 1726882591.04607: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": 
"libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", 
"release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 
1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", 
"release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": 
[{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": 
"3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", 
"version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", 
"version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": 
"1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "1.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": 
"wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": 
"python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 15621 1726882591.14324: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882589.6145034-16393-17488716523219/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15621 1726882591.14329: _low_level_execute_command(): starting 15621 1726882591.14331: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882589.6145034-16393-17488716523219/ > /dev/null 2>&1 && sleep 0' 15621 1726882591.14903: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882591.14912: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882591.14926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882591.15086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882591.15089: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 15621 1726882591.15092: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882591.15094: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882591.15097: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882591.15199: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882591.15293: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882591.17438: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882591.17830: stdout chunk (state=3): >>><<< 15621 1726882591.17834: stderr chunk (state=3): >>><<< 15621 1726882591.17836: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882591.17838: handler run complete 15621 1726882591.19017: variable 'ansible_facts' from source: unknown 15621 1726882591.19661: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882591.23499: variable 'ansible_facts' from source: unknown 15621 1726882591.24938: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882591.27486: attempt loop complete, returning result 15621 1726882591.27928: _execute() done 15621 1726882591.27932: dumping result to json 15621 1726882591.28220: done dumping result, returning 15621 1726882591.28239: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affc7ec-ae25-af1a-5b92-00000000027f] 15621 1726882591.28248: sending task result for task 0affc7ec-ae25-af1a-5b92-00000000027f 15621 1726882591.31936: done sending task result for task 0affc7ec-ae25-af1a-5b92-00000000027f 15621 1726882591.31940: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15621 1726882591.32032: no more pending results, returning what we have 15621 1726882591.32034: results queue empty 15621 1726882591.32035: checking for any_errors_fatal 15621 1726882591.32039: done checking for any_errors_fatal 15621 1726882591.32040: checking for max_fail_percentage 15621 1726882591.32045: done checking for max_fail_percentage 15621 1726882591.32046: checking to see if all hosts have failed and the running result is not ok 15621 1726882591.32047: done checking to see if all hosts have failed 15621 1726882591.32048: getting the remaining hosts for this loop 15621 1726882591.32049: done getting the remaining hosts for this loop 15621 1726882591.32052: getting the next task for host managed_node3 15621 1726882591.32059: done getting next task for host managed_node3 15621 1726882591.32063: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 15621 1726882591.32064: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882591.32076: getting variables 15621 1726882591.32078: in VariableManager get_vars() 15621 1726882591.32107: Calling all_inventory to load vars for managed_node3 15621 1726882591.32110: Calling groups_inventory to load vars for managed_node3 15621 1726882591.32112: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882591.32121: Calling all_plugins_play to load vars for managed_node3 15621 1726882591.32126: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882591.32129: Calling groups_plugins_play to load vars for managed_node3 15621 1726882591.34341: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882591.36519: done with get_vars() 15621 1726882591.36547: done getting variables 15621 1726882591.36616: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:36:31 -0400 (0:00:01.837) 0:00:23.445 ****** 15621 1726882591.36649: entering _queue_task() for managed_node3/debug 15621 1726882591.36984: worker is 1 (out of 1 available) 15621 1726882591.36999: exiting _queue_task() for managed_node3/debug 15621 1726882591.37012: done queuing things up, now waiting for results queue to drain 15621 1726882591.37014: waiting for pending results... 15621 1726882591.37290: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 15621 1726882591.37368: in run() - task 0affc7ec-ae25-af1a-5b92-00000000001a 15621 1726882591.37397: variable 'ansible_search_path' from source: unknown 15621 1726882591.37406: variable 'ansible_search_path' from source: unknown 15621 1726882591.37457: calling self._execute() 15621 1726882591.37558: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882591.37603: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882591.37607: variable 'omit' from source: magic vars 15621 1726882591.37969: variable 'ansible_distribution_major_version' from source: facts 15621 1726882591.37986: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882591.37997: variable 'omit' from source: magic vars 15621 1726882591.38040: variable 'omit' from source: magic vars 15621 1726882591.38149: variable 'network_provider' from source: set_fact 15621 1726882591.38227: variable 'omit' from source: magic vars 15621 1726882591.38230: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882591.38254: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882591.38279: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882591.38303: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882591.38323: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 
1726882591.38367: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882591.38377: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882591.38386: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882591.38500: Set connection var ansible_connection to ssh 15621 1726882591.38516: Set connection var ansible_shell_executable to /bin/sh 15621 1726882591.38584: Set connection var ansible_timeout to 10 15621 1726882591.38588: Set connection var ansible_shell_type to sh 15621 1726882591.38591: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882591.38593: Set connection var ansible_pipelining to False 15621 1726882591.38595: variable 'ansible_shell_executable' from source: unknown 15621 1726882591.38598: variable 'ansible_connection' from source: unknown 15621 1726882591.38600: variable 'ansible_module_compression' from source: unknown 15621 1726882591.38602: variable 'ansible_shell_type' from source: unknown 15621 1726882591.38606: variable 'ansible_shell_executable' from source: unknown 15621 1726882591.38615: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882591.38626: variable 'ansible_pipelining' from source: unknown 15621 1726882591.38634: variable 'ansible_timeout' from source: unknown 15621 1726882591.38643: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882591.38809: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15621 1726882591.38919: variable 'omit' from source: magic vars 15621 1726882591.38922: starting attempt loop 15621 1726882591.38927: running the handler 15621 1726882591.38930: handler run complete 15621 1726882591.38932: attempt loop complete, returning result 15621 1726882591.38934: _execute() done 15621 1726882591.38937: dumping result to json 15621 1726882591.38939: done dumping result, returning 15621 1726882591.38942: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0affc7ec-ae25-af1a-5b92-00000000001a] 15621 1726882591.38951: sending task result for task 0affc7ec-ae25-af1a-5b92-00000000001a ok: [managed_node3] => {} MSG: Using network provider: nm 15621 1726882591.39140: done sending task result for task 0affc7ec-ae25-af1a-5b92-00000000001a 15621 1726882591.39143: WORKER PROCESS EXITING 15621 1726882591.39156: no more pending results, returning what we have 15621 1726882591.39159: results queue empty 15621 1726882591.39161: checking for any_errors_fatal 15621 1726882591.39170: done checking for any_errors_fatal 15621 1726882591.39171: checking for max_fail_percentage 15621 1726882591.39172: done checking for max_fail_percentage 15621 1726882591.39173: checking to see if all hosts have failed and the running result is not ok 15621 1726882591.39175: done checking to see if all hosts have failed 15621 1726882591.39175: getting the remaining hosts for this loop 15621 1726882591.39177: done getting the remaining hosts for this loop 15621 1726882591.39182: getting the next task for host managed_node3 15621 1726882591.39188: done getting next task for host managed_node3 15621 1726882591.39193: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state 
configuration if using the `network_state` variable with the initscripts provider 15621 1726882591.39195: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15621 1726882591.39207: getting variables 15621 1726882591.39209: in VariableManager get_vars() 15621 1726882591.39264: Calling all_inventory to load vars for managed_node3 15621 1726882591.39267: Calling groups_inventory to load vars for managed_node3 15621 1726882591.39270: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882591.39283: Calling all_plugins_play to load vars for managed_node3 15621 1726882591.39286: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882591.39290: Calling groups_plugins_play to load vars for managed_node3 15621 1726882591.41054: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882591.43040: done with get_vars() 15621 1726882591.43070: done getting variables 15621 1726882591.43140: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:36:31 -0400 (0:00:00.065) 0:00:23.510 ****** 15621 1726882591.43175: entering _queue_task() for managed_node3/fail 15621 1726882591.43485: worker is 1 (out of 1 available) 15621 1726882591.43500: exiting _queue_task() for managed_node3/fail 15621 1726882591.43513: done queuing things up, now waiting for results queue to drain 15621 1726882591.43515: waiting for pending results... 
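For readers following the trace: the ok result above ("Using network provider: nm") comes from an ordinary debug task at roles/network/tasks/main.yml:7, printing the network_provider variable that the log shows being read from set_fact. Below is a minimal sketch of an equivalent task, not the shipped role's exact source; the msg wording is reconstructed from the logged output.

    # Sketch of the task whose result is logged above (roles/network/tasks/main.yml:7).
    # Assumes network_provider was set earlier via set_fact, as the log indicates.
    - name: Print network provider
      debug:
        msg: "Using network provider: {{ network_provider }}"
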
15621 1726882591.43799: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 15621 1726882591.43920: in run() - task 0affc7ec-ae25-af1a-5b92-00000000001b 15621 1726882591.43951: variable 'ansible_search_path' from source: unknown 15621 1726882591.43959: variable 'ansible_search_path' from source: unknown 15621 1726882591.44000: calling self._execute() 15621 1726882591.44129: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882591.44133: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882591.44137: variable 'omit' from source: magic vars 15621 1726882591.44500: variable 'ansible_distribution_major_version' from source: facts 15621 1726882591.44517: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882591.44649: variable 'network_state' from source: role '' defaults 15621 1726882591.44663: Evaluated conditional (network_state != {}): False 15621 1726882591.44675: when evaluation is False, skipping this task 15621 1726882591.44682: _execute() done 15621 1726882591.44689: dumping result to json 15621 1726882591.44695: done dumping result, returning 15621 1726882591.44706: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affc7ec-ae25-af1a-5b92-00000000001b] 15621 1726882591.44728: sending task result for task 0affc7ec-ae25-af1a-5b92-00000000001b skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15621 1726882591.44874: no more pending results, returning what we have 15621 1726882591.44879: results queue empty 15621 1726882591.44880: checking for any_errors_fatal 15621 1726882591.44888: done checking for any_errors_fatal 15621 1726882591.44889: checking for max_fail_percentage 15621 1726882591.44890: done checking for max_fail_percentage 15621 1726882591.44891: checking to see if all hosts have failed and the running result is not ok 15621 1726882591.44893: done checking to see if all hosts have failed 15621 1726882591.44893: getting the remaining hosts for this loop 15621 1726882591.44895: done getting the remaining hosts for this loop 15621 1726882591.44899: getting the next task for host managed_node3 15621 1726882591.44905: done getting next task for host managed_node3 15621 1726882591.44909: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 15621 1726882591.44912: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882591.44931: getting variables 15621 1726882591.44932: in VariableManager get_vars() 15621 1726882591.44970: Calling all_inventory to load vars for managed_node3 15621 1726882591.44972: Calling groups_inventory to load vars for managed_node3 15621 1726882591.44974: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882591.44988: Calling all_plugins_play to load vars for managed_node3 15621 1726882591.44991: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882591.44994: Calling groups_plugins_play to load vars for managed_node3 15621 1726882591.45738: done sending task result for task 0affc7ec-ae25-af1a-5b92-00000000001b 15621 1726882591.45742: WORKER PROCESS EXITING 15621 1726882591.46986: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882591.48985: done with get_vars() 15621 1726882591.49010: done getting variables 15621 1726882591.49073: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:36:31 -0400 (0:00:00.059) 0:00:23.570 ****** 15621 1726882591.49102: entering _queue_task() for managed_node3/fail 15621 1726882591.49384: worker is 1 (out of 1 available) 15621 1726882591.49399: exiting _queue_task() for managed_node3/fail 15621 1726882591.49413: done queuing things up, now waiting for results queue to drain 15621 1726882591.49415: waiting for pending results... 
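[Editor's note] Every task in this block is first gated on ansible_distribution_major_version != '6', and the two abort tasks above and below both skip on network_state != {} because the role default for network_state is an empty dict. A quick way to surface the facts and variables driving these guards is a debug task like the following sketch (illustrative only; the variable names are the ones shown in this log):

- name: Show the facts and variables behind the guards in this block (illustrative)
  debug:
    msg:
      - "distribution: {{ ansible_distribution }}"
      - "major version: {{ ansible_distribution_major_version }}"
      - "network_state: {{ network_state | default({}) }}"
      - "network provider: {{ network_provider | default('undefined') }}"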
15621 1726882591.49715: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 15621 1726882591.50031: in run() - task 0affc7ec-ae25-af1a-5b92-00000000001c 15621 1726882591.50035: variable 'ansible_search_path' from source: unknown 15621 1726882591.50039: variable 'ansible_search_path' from source: unknown 15621 1726882591.50043: calling self._execute() 15621 1726882591.50045: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882591.50048: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882591.50051: variable 'omit' from source: magic vars 15621 1726882591.50445: variable 'ansible_distribution_major_version' from source: facts 15621 1726882591.50468: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882591.50602: variable 'network_state' from source: role '' defaults 15621 1726882591.50620: Evaluated conditional (network_state != {}): False 15621 1726882591.50631: when evaluation is False, skipping this task 15621 1726882591.50639: _execute() done 15621 1726882591.50647: dumping result to json 15621 1726882591.50656: done dumping result, returning 15621 1726882591.50667: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affc7ec-ae25-af1a-5b92-00000000001c] 15621 1726882591.50680: sending task result for task 0affc7ec-ae25-af1a-5b92-00000000001c 15621 1726882591.50932: done sending task result for task 0affc7ec-ae25-af1a-5b92-00000000001c 15621 1726882591.50936: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15621 1726882591.50985: no more pending results, returning what we have 15621 1726882591.50989: results queue empty 15621 1726882591.50990: checking for any_errors_fatal 15621 1726882591.50998: done checking for any_errors_fatal 15621 1726882591.50999: checking for max_fail_percentage 15621 1726882591.51000: done checking for max_fail_percentage 15621 1726882591.51002: checking to see if all hosts have failed and the running result is not ok 15621 1726882591.51003: done checking to see if all hosts have failed 15621 1726882591.51003: getting the remaining hosts for this loop 15621 1726882591.51005: done getting the remaining hosts for this loop 15621 1726882591.51009: getting the next task for host managed_node3 15621 1726882591.51015: done getting next task for host managed_node3 15621 1726882591.51020: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 15621 1726882591.51024: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882591.51041: getting variables 15621 1726882591.51043: in VariableManager get_vars() 15621 1726882591.51081: Calling all_inventory to load vars for managed_node3 15621 1726882591.51084: Calling groups_inventory to load vars for managed_node3 15621 1726882591.51086: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882591.51099: Calling all_plugins_play to load vars for managed_node3 15621 1726882591.51102: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882591.51105: Calling groups_plugins_play to load vars for managed_node3 15621 1726882591.52835: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882591.55014: done with get_vars() 15621 1726882591.55042: done getting variables 15621 1726882591.55106: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:36:31 -0400 (0:00:00.060) 0:00:23.630 ****** 15621 1726882591.55137: entering _queue_task() for managed_node3/fail 15621 1726882591.55477: worker is 1 (out of 1 available) 15621 1726882591.55493: exiting _queue_task() for managed_node3/fail 15621 1726882591.55508: done queuing things up, now waiting for results queue to drain 15621 1726882591.55510: waiting for pending results... 
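[Editor's note] The teaming guard queued above is skipped in the run that follows because, after ansible_distribution_major_version | int > 9 evaluates to True, ansible_distribution in __network_rh_distros evaluates to False for this host. Below is a sketch of a task with that ordered condition list, reconstructed only from what the log shows; the real role may check further conditions (for example, whether a team connection is actually requested), and the message is a placeholder.

- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  fail:
    msg: Team interfaces are not supported on this platform.  # placeholder wording
  when:
    - ansible_distribution_major_version | int > 9
    - ansible_distribution in __network_rh_distros
    # evaluation stops here in this run; any further conditions are never reached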
15621 1726882591.55845: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 15621 1726882591.55969: in run() - task 0affc7ec-ae25-af1a-5b92-00000000001d 15621 1726882591.55996: variable 'ansible_search_path' from source: unknown 15621 1726882591.56004: variable 'ansible_search_path' from source: unknown 15621 1726882591.56060: calling self._execute() 15621 1726882591.56163: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882591.56181: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882591.56197: variable 'omit' from source: magic vars 15621 1726882591.56620: variable 'ansible_distribution_major_version' from source: facts 15621 1726882591.56641: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882591.56923: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15621 1726882591.59415: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15621 1726882591.59504: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15621 1726882591.59554: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15621 1726882591.59598: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15621 1726882591.59630: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15621 1726882591.59972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882591.60089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882591.60117: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882591.60175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882591.60196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882591.60379: variable 'ansible_distribution_major_version' from source: facts 15621 1726882591.60383: Evaluated conditional (ansible_distribution_major_version | int > 9): True 15621 1726882591.60479: variable 'ansible_distribution' from source: facts 15621 1726882591.60490: variable '__network_rh_distros' from source: role '' defaults 15621 1726882591.60514: Evaluated conditional (ansible_distribution in __network_rh_distros): False 15621 1726882591.60525: when evaluation is False, skipping this task 15621 1726882591.60533: _execute() done 15621 1726882591.60541: dumping result to json 15621 1726882591.60550: done dumping result, returning 15621 1726882591.60610: done running TaskExecutor() 
for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affc7ec-ae25-af1a-5b92-00000000001d] 15621 1726882591.60614: sending task result for task 0affc7ec-ae25-af1a-5b92-00000000001d 15621 1726882591.60694: done sending task result for task 0affc7ec-ae25-af1a-5b92-00000000001d 15621 1726882591.60697: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 15621 1726882591.60765: no more pending results, returning what we have 15621 1726882591.60769: results queue empty 15621 1726882591.60773: checking for any_errors_fatal 15621 1726882591.60777: done checking for any_errors_fatal 15621 1726882591.60778: checking for max_fail_percentage 15621 1726882591.60779: done checking for max_fail_percentage 15621 1726882591.60780: checking to see if all hosts have failed and the running result is not ok 15621 1726882591.60781: done checking to see if all hosts have failed 15621 1726882591.60782: getting the remaining hosts for this loop 15621 1726882591.60783: done getting the remaining hosts for this loop 15621 1726882591.60788: getting the next task for host managed_node3 15621 1726882591.60794: done getting next task for host managed_node3 15621 1726882591.60799: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 15621 1726882591.60800: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882591.60815: getting variables 15621 1726882591.60817: in VariableManager get_vars() 15621 1726882591.60862: Calling all_inventory to load vars for managed_node3 15621 1726882591.60865: Calling groups_inventory to load vars for managed_node3 15621 1726882591.60867: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882591.60883: Calling all_plugins_play to load vars for managed_node3 15621 1726882591.60886: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882591.60890: Calling groups_plugins_play to load vars for managed_node3 15621 1726882591.62867: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882591.64950: done with get_vars() 15621 1726882591.64980: done getting variables 15621 1726882591.65082: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:36:31 -0400 (0:00:00.099) 0:00:23.730 ****** 15621 1726882591.65111: entering _queue_task() for managed_node3/dnf 15621 1726882591.65655: worker is 1 (out of 1 available) 15621 1726882591.65663: exiting _queue_task() for managed_node3/dnf 15621 1726882591.65675: done queuing things up, now waiting for results queue to drain 15621 1726882591.65677: waiting for pending results... 
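[Editor's note] The DNF check queued above resolves network_connections and the interface set_fact several times, then skips because neither __network_wireless_connections_defined nor __network_team_connections_defined is true for this connection profile. The sketch below reconstructs the shape of such a check from the two conditionals in the log; the module arguments (package list, state, check_mode) and the register name are assumptions for illustration, not the role's actual parameters.

- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  dnf:
    name: "{{ network_packages }}"  # assumed package list for illustration
    state: latest                   # assumed
  check_mode: true                  # assumed: a check-only task should not change the host
  register: __network_dnf_check     # hypothetical register name
  when:
    - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
    - __network_wireless_connections_defined or __network_team_connections_defined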
15621 1726882591.65807: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 15621 1726882591.65856: in run() - task 0affc7ec-ae25-af1a-5b92-00000000001e 15621 1726882591.65901: variable 'ansible_search_path' from source: unknown 15621 1726882591.65905: variable 'ansible_search_path' from source: unknown 15621 1726882591.65941: calling self._execute() 15621 1726882591.66119: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882591.66125: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882591.66128: variable 'omit' from source: magic vars 15621 1726882591.66489: variable 'ansible_distribution_major_version' from source: facts 15621 1726882591.66508: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882591.66738: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15621 1726882591.69160: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15621 1726882591.69632: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15621 1726882591.69679: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15621 1726882591.69725: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15621 1726882591.69758: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15621 1726882591.69856: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882591.69940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882591.69944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882591.69984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882591.70005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882591.70152: variable 'ansible_distribution' from source: facts 15621 1726882591.70167: variable 'ansible_distribution_major_version' from source: facts 15621 1726882591.70184: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 15621 1726882591.70316: variable '__network_wireless_connections_defined' from source: role '' defaults 15621 1726882591.70597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882591.70601: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882591.70603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882591.70605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882591.70608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882591.70649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882591.70679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882591.70712: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882591.70763: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882591.70783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882591.70836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882591.70864: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882591.70897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882591.70947: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882591.70965: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882591.71140: variable 'network_connections' from source: play vars 15621 1726882591.71156: variable 'interface' from source: set_fact 15621 1726882591.71233: variable 'interface' from source: set_fact 15621 1726882591.71250: variable 'interface' from source: set_fact 15621 1726882591.71313: variable 'interface' from source: set_fact 15621 1726882591.71402: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' 
skipped due to reserved name 15621 1726882591.71592: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15621 1726882591.71640: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15621 1726882591.71727: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15621 1726882591.71730: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15621 1726882591.71768: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15621 1726882591.71804: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15621 1726882591.71847: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882591.71884: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15621 1726882591.71956: variable '__network_team_connections_defined' from source: role '' defaults 15621 1726882591.76628: variable 'network_connections' from source: play vars 15621 1726882591.76634: variable 'interface' from source: set_fact 15621 1726882591.76655: variable 'interface' from source: set_fact 15621 1726882591.76667: variable 'interface' from source: set_fact 15621 1726882591.76743: variable 'interface' from source: set_fact 15621 1726882591.76789: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15621 1726882591.76797: when evaluation is False, skipping this task 15621 1726882591.76805: _execute() done 15621 1726882591.76812: dumping result to json 15621 1726882591.76819: done dumping result, returning 15621 1726882591.76837: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affc7ec-ae25-af1a-5b92-00000000001e] 15621 1726882591.76849: sending task result for task 0affc7ec-ae25-af1a-5b92-00000000001e 15621 1726882591.77151: done sending task result for task 0affc7ec-ae25-af1a-5b92-00000000001e 15621 1726882591.77155: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15621 1726882591.77210: no more pending results, returning what we have 15621 1726882591.77214: results queue empty 15621 1726882591.77215: checking for any_errors_fatal 15621 1726882591.77221: done checking for any_errors_fatal 15621 1726882591.77224: checking for max_fail_percentage 15621 1726882591.77226: done checking for max_fail_percentage 15621 1726882591.77227: checking to see if all hosts have failed and the running result is not ok 15621 1726882591.77228: done checking to see if all hosts have failed 15621 1726882591.77229: getting the remaining hosts for this loop 15621 1726882591.77230: done getting the remaining hosts for this loop 15621 
1726882591.77234: getting the next task for host managed_node3 15621 1726882591.77240: done getting next task for host managed_node3 15621 1726882591.77244: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 15621 1726882591.77246: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15621 1726882591.77261: getting variables 15621 1726882591.77262: in VariableManager get_vars() 15621 1726882591.77306: Calling all_inventory to load vars for managed_node3 15621 1726882591.77309: Calling groups_inventory to load vars for managed_node3 15621 1726882591.77312: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882591.77530: Calling all_plugins_play to load vars for managed_node3 15621 1726882591.77535: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882591.77539: Calling groups_plugins_play to load vars for managed_node3 15621 1726882591.83001: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882591.85095: done with get_vars() 15621 1726882591.85127: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 15621 1726882591.85200: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:36:31 -0400 (0:00:00.201) 0:00:23.931 ****** 15621 1726882591.85238: entering _queue_task() for managed_node3/yum 15621 1726882591.85240: Creating lock for yum 15621 1726882591.85601: worker is 1 (out of 1 available) 15621 1726882591.85614: exiting _queue_task() for managed_node3/yum 15621 1726882591.85628: done queuing things up, now waiting for results queue to drain 15621 1726882591.85630: waiting for pending results... 
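[Editor's note] Note the "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" line above: on this controller the yum action is an alias for dnf. The task itself is skipped in the run that follows because ansible_distribution_major_version | int < 8 is False. An illustrative counterpart to the DNF sketch above, again with assumed module arguments:

- name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
  yum:                              # redirected to ansible.builtin.dnf by ansible-core, as the log notes
    name: "{{ network_packages }}"  # assumed
    state: latest                   # assumed
  check_mode: true                  # assumed
  when:
    - ansible_distribution_major_version | int < 8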
15621 1726882591.86044: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 15621 1726882591.86057: in run() - task 0affc7ec-ae25-af1a-5b92-00000000001f 15621 1726882591.86082: variable 'ansible_search_path' from source: unknown 15621 1726882591.86089: variable 'ansible_search_path' from source: unknown 15621 1726882591.86136: calling self._execute() 15621 1726882591.86240: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882591.86256: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882591.86268: variable 'omit' from source: magic vars 15621 1726882591.86691: variable 'ansible_distribution_major_version' from source: facts 15621 1726882591.86710: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882591.86899: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15621 1726882591.89436: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15621 1726882591.89517: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15621 1726882591.89629: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15621 1726882591.89633: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15621 1726882591.89644: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15621 1726882591.89735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882591.89780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882591.89811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882591.89863: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882591.89885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882591.90028: variable 'ansible_distribution_major_version' from source: facts 15621 1726882591.90031: Evaluated conditional (ansible_distribution_major_version | int < 8): False 15621 1726882591.90034: when evaluation is False, skipping this task 15621 1726882591.90036: _execute() done 15621 1726882591.90039: dumping result to json 15621 1726882591.90040: done dumping result, returning 15621 1726882591.90043: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affc7ec-ae25-af1a-5b92-00000000001f] 15621 
1726882591.90051: sending task result for task 0affc7ec-ae25-af1a-5b92-00000000001f skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 15621 1726882591.90383: no more pending results, returning what we have 15621 1726882591.90387: results queue empty 15621 1726882591.90388: checking for any_errors_fatal 15621 1726882591.90398: done checking for any_errors_fatal 15621 1726882591.90399: checking for max_fail_percentage 15621 1726882591.90400: done checking for max_fail_percentage 15621 1726882591.90401: checking to see if all hosts have failed and the running result is not ok 15621 1726882591.90403: done checking to see if all hosts have failed 15621 1726882591.90403: getting the remaining hosts for this loop 15621 1726882591.90405: done getting the remaining hosts for this loop 15621 1726882591.90410: getting the next task for host managed_node3 15621 1726882591.90416: done getting next task for host managed_node3 15621 1726882591.90429: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 15621 1726882591.90431: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15621 1726882591.90447: getting variables 15621 1726882591.90449: in VariableManager get_vars() 15621 1726882591.90495: Calling all_inventory to load vars for managed_node3 15621 1726882591.90498: Calling groups_inventory to load vars for managed_node3 15621 1726882591.90501: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882591.90514: Calling all_plugins_play to load vars for managed_node3 15621 1726882591.90518: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882591.90521: Calling groups_plugins_play to load vars for managed_node3 15621 1726882591.90732: done sending task result for task 0affc7ec-ae25-af1a-5b92-00000000001f 15621 1726882591.90735: WORKER PROCESS EXITING 15621 1726882591.92496: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882591.95298: done with get_vars() 15621 1726882591.95327: done getting variables 15621 1726882591.95465: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:36:31 -0400 (0:00:00.102) 0:00:24.034 ****** 15621 1726882591.95508: entering _queue_task() for managed_node3/fail 15621 1726882591.95918: worker is 1 (out of 1 available) 15621 1726882591.95950: exiting _queue_task() for managed_node3/fail 15621 1726882591.95970: done queuing things up, now waiting for results queue to drain 15621 1726882591.95972: waiting for pending results... 
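[Editor's note] The consent task queued above is another fail guard; it is skipped below because __network_wireless_connections_defined or __network_team_connections_defined evaluates to False for the connection profile built from the interface set_fact in this play. A minimal sketch follows, with a placeholder message; the role's actual wording, and any user-controlled override flag it honours, are not visible in this log.

- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  fail:
    # placeholder wording; the role's actual message is not shown in this log
    msg: >-
      The requested wireless or team connections require restarting NetworkManager;
      confirm the restart before re-running the role.
  when:
    - __network_wireless_connections_defined or __network_team_connections_defined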
15621 1726882591.96351: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 15621 1726882591.96501: in run() - task 0affc7ec-ae25-af1a-5b92-000000000020 15621 1726882591.96525: variable 'ansible_search_path' from source: unknown 15621 1726882591.96538: variable 'ansible_search_path' from source: unknown 15621 1726882591.96608: calling self._execute() 15621 1726882591.96714: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882591.96796: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882591.96800: variable 'omit' from source: magic vars 15621 1726882591.97216: variable 'ansible_distribution_major_version' from source: facts 15621 1726882591.97243: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882591.97383: variable '__network_wireless_connections_defined' from source: role '' defaults 15621 1726882591.97635: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15621 1726882592.00302: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15621 1726882592.00399: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15621 1726882592.00628: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15621 1726882592.00631: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15621 1726882592.00634: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15621 1726882592.00637: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882592.00673: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882592.00711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882592.00768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882592.00803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882592.00877: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882592.00911: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882592.00944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882592.00999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882592.01039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882592.01103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882592.01137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882592.01169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882592.01225: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882592.01298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882592.02011: variable 'network_connections' from source: play vars 15621 1726882592.02129: variable 'interface' from source: set_fact 15621 1726882592.02220: variable 'interface' from source: set_fact 15621 1726882592.02381: variable 'interface' from source: set_fact 15621 1726882592.02514: variable 'interface' from source: set_fact 15621 1726882592.02652: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15621 1726882592.03392: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15621 1726882592.03686: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15621 1726882592.03712: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15621 1726882592.03781: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15621 1726882592.03912: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15621 1726882592.04001: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15621 1726882592.04235: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882592.04239: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15621 1726882592.04397: 
variable '__network_team_connections_defined' from source: role '' defaults 15621 1726882592.05225: variable 'network_connections' from source: play vars 15621 1726882592.05337: variable 'interface' from source: set_fact 15621 1726882592.05456: variable 'interface' from source: set_fact 15621 1726882592.05540: variable 'interface' from source: set_fact 15621 1726882592.05728: variable 'interface' from source: set_fact 15621 1726882592.05732: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15621 1726882592.05735: when evaluation is False, skipping this task 15621 1726882592.05737: _execute() done 15621 1726882592.05740: dumping result to json 15621 1726882592.05742: done dumping result, returning 15621 1726882592.05845: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affc7ec-ae25-af1a-5b92-000000000020] 15621 1726882592.05865: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000020 15621 1726882592.06212: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000020 15621 1726882592.06215: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15621 1726882592.06283: no more pending results, returning what we have 15621 1726882592.06287: results queue empty 15621 1726882592.06289: checking for any_errors_fatal 15621 1726882592.06296: done checking for any_errors_fatal 15621 1726882592.06296: checking for max_fail_percentage 15621 1726882592.06298: done checking for max_fail_percentage 15621 1726882592.06299: checking to see if all hosts have failed and the running result is not ok 15621 1726882592.06300: done checking to see if all hosts have failed 15621 1726882592.06302: getting the remaining hosts for this loop 15621 1726882592.06304: done getting the remaining hosts for this loop 15621 1726882592.06308: getting the next task for host managed_node3 15621 1726882592.06314: done getting next task for host managed_node3 15621 1726882592.06319: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 15621 1726882592.06321: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882592.06338: getting variables 15621 1726882592.06340: in VariableManager get_vars() 15621 1726882592.06386: Calling all_inventory to load vars for managed_node3 15621 1726882592.06389: Calling groups_inventory to load vars for managed_node3 15621 1726882592.06391: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882592.06402: Calling all_plugins_play to load vars for managed_node3 15621 1726882592.06405: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882592.06409: Calling groups_plugins_play to load vars for managed_node3 15621 1726882592.09300: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882592.13198: done with get_vars() 15621 1726882592.13434: done getting variables 15621 1726882592.13503: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:36:32 -0400 (0:00:00.180) 0:00:24.214 ****** 15621 1726882592.13538: entering _queue_task() for managed_node3/package 15621 1726882592.13886: worker is 1 (out of 1 available) 15621 1726882592.13901: exiting _queue_task() for managed_node3/package 15621 1726882592.13914: done queuing things up, now waiting for results queue to drain 15621 1726882592.13916: waiting for pending results... 15621 1726882592.14216: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 15621 1726882592.14346: in run() - task 0affc7ec-ae25-af1a-5b92-000000000021 15621 1726882592.14373: variable 'ansible_search_path' from source: unknown 15621 1726882592.14384: variable 'ansible_search_path' from source: unknown 15621 1726882592.14433: calling self._execute() 15621 1726882592.14543: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882592.14557: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882592.14580: variable 'omit' from source: magic vars 15621 1726882592.15002: variable 'ansible_distribution_major_version' from source: facts 15621 1726882592.15025: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882592.15530: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15621 1726882592.15996: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15621 1726882592.16230: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15621 1726882592.16274: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15621 1726882592.16376: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15621 1726882592.16666: variable 'network_packages' from source: role '' defaults 15621 1726882592.16993: variable '__network_provider_setup' from source: role '' defaults 15621 1726882592.17010: variable '__network_service_name_default_nm' from source: role '' defaults 15621 1726882592.17148: variable 
'__network_service_name_default_nm' from source: role '' defaults 15621 1726882592.17162: variable '__network_packages_default_nm' from source: role '' defaults 15621 1726882592.17325: variable '__network_packages_default_nm' from source: role '' defaults 15621 1726882592.17767: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15621 1726882592.23149: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15621 1726882592.23440: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15621 1726882592.23444: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15621 1726882592.23447: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15621 1726882592.23551: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15621 1726882592.23680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882592.23760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882592.23986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882592.23990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882592.23993: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882592.24204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882592.24207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882592.24210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882592.24341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882592.24363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882592.24848: variable '__network_packages_default_gobject_packages' from source: role '' defaults 15621 1726882592.25005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882592.25037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882592.25074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882592.25128: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882592.25148: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882592.25255: variable 'ansible_python' from source: facts 15621 1726882592.25291: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 15621 1726882592.25388: variable '__network_wpa_supplicant_required' from source: role '' defaults 15621 1726882592.25486: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15621 1726882592.25645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882592.25679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882592.25715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882592.25768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882592.25791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882592.25851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882592.25956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882592.25960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882592.25970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882592.25993: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882592.26157: variable 'network_connections' from source: play vars 15621 1726882592.26177: variable 'interface' from source: set_fact 15621 1726882592.26291: variable 'interface' from source: set_fact 15621 1726882592.26308: variable 'interface' from source: set_fact 15621 1726882592.26426: variable 'interface' from source: set_fact 15621 1726882592.26513: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15621 1726882592.26613: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15621 1726882592.26616: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882592.26632: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15621 1726882592.26683: variable '__network_wireless_connections_defined' from source: role '' defaults 15621 1726882592.26995: variable 'network_connections' from source: play vars 15621 1726882592.27037: variable 'interface' from source: set_fact 15621 1726882592.27269: variable 'interface' from source: set_fact 15621 1726882592.27289: variable 'interface' from source: set_fact 15621 1726882592.27545: variable 'interface' from source: set_fact 15621 1726882592.27830: variable '__network_packages_default_wireless' from source: role '' defaults 15621 1726882592.27834: variable '__network_wireless_connections_defined' from source: role '' defaults 15621 1726882592.28647: variable 'network_connections' from source: play vars 15621 1726882592.28660: variable 'interface' from source: set_fact 15621 1726882592.28886: variable 'interface' from source: set_fact 15621 1726882592.28889: variable 'interface' from source: set_fact 15621 1726882592.28934: variable 'interface' from source: set_fact 15621 1726882592.29020: variable '__network_packages_default_team' from source: role '' defaults 15621 1726882592.29189: variable '__network_team_connections_defined' from source: role '' defaults 15621 1726882592.30019: variable 'network_connections' from source: play vars 15621 1726882592.30034: variable 'interface' from source: set_fact 15621 1726882592.30107: variable 'interface' from source: set_fact 15621 1726882592.30141: variable 'interface' from source: set_fact 15621 1726882592.30214: variable 'interface' from source: set_fact 15621 1726882592.30415: variable '__network_service_name_default_initscripts' from source: role '' defaults 15621 1726882592.30525: variable '__network_service_name_default_initscripts' from source: role '' defaults 15621 1726882592.30727: variable '__network_packages_default_initscripts' from source: role '' defaults 15621 1726882592.30760: variable '__network_packages_default_initscripts' from source: role '' defaults 15621 1726882592.31289: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 15621 1726882592.32468: variable 'network_connections' from source: play vars 15621 1726882592.32483: variable 'interface' from source: set_fact 15621 
1726882592.32830: variable 'interface' from source: set_fact 15621 1726882592.32834: variable 'interface' from source: set_fact 15621 1726882592.32837: variable 'interface' from source: set_fact 15621 1726882592.32851: variable 'ansible_distribution' from source: facts 15621 1726882592.32860: variable '__network_rh_distros' from source: role '' defaults 15621 1726882592.32873: variable 'ansible_distribution_major_version' from source: facts 15621 1726882592.32903: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 15621 1726882592.33283: variable 'ansible_distribution' from source: facts 15621 1726882592.33292: variable '__network_rh_distros' from source: role '' defaults 15621 1726882592.33302: variable 'ansible_distribution_major_version' from source: facts 15621 1726882592.33313: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 15621 1726882592.33675: variable 'ansible_distribution' from source: facts 15621 1726882592.33921: variable '__network_rh_distros' from source: role '' defaults 15621 1726882592.33924: variable 'ansible_distribution_major_version' from source: facts 15621 1726882592.33929: variable 'network_provider' from source: set_fact 15621 1726882592.33931: variable 'ansible_facts' from source: unknown 15621 1726882592.35669: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 15621 1726882592.35795: when evaluation is False, skipping this task 15621 1726882592.35820: _execute() done 15621 1726882592.35851: dumping result to json 15621 1726882592.35860: done dumping result, returning 15621 1726882592.35886: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0affc7ec-ae25-af1a-5b92-000000000021] 15621 1726882592.35924: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000021 15621 1726882592.36429: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000021 15621 1726882592.36433: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 15621 1726882592.36488: no more pending results, returning what we have 15621 1726882592.36492: results queue empty 15621 1726882592.36494: checking for any_errors_fatal 15621 1726882592.36501: done checking for any_errors_fatal 15621 1726882592.36502: checking for max_fail_percentage 15621 1726882592.36503: done checking for max_fail_percentage 15621 1726882592.36504: checking to see if all hosts have failed and the running result is not ok 15621 1726882592.36505: done checking to see if all hosts have failed 15621 1726882592.36506: getting the remaining hosts for this loop 15621 1726882592.36508: done getting the remaining hosts for this loop 15621 1726882592.36512: getting the next task for host managed_node3 15621 1726882592.36518: done getting next task for host managed_node3 15621 1726882592.36525: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 15621 1726882592.36527: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882592.36542: getting variables 15621 1726882592.36544: in VariableManager get_vars() 15621 1726882592.36590: Calling all_inventory to load vars for managed_node3 15621 1726882592.36593: Calling groups_inventory to load vars for managed_node3 15621 1726882592.36596: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882592.36612: Calling all_plugins_play to load vars for managed_node3 15621 1726882592.36615: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882592.36619: Calling groups_plugins_play to load vars for managed_node3 15621 1726882592.40747: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882592.45355: done with get_vars() 15621 1726882592.45389: done getting variables 15621 1726882592.45570: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:36:32 -0400 (0:00:00.320) 0:00:24.535 ****** 15621 1726882592.45603: entering _queue_task() for managed_node3/package 15621 1726882592.46350: worker is 1 (out of 1 available) 15621 1726882592.46365: exiting _queue_task() for managed_node3/package 15621 1726882592.46378: done queuing things up, now waiting for results queue to drain 15621 1726882592.46494: waiting for pending results... 
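The skip of the preceding "Install packages" task is driven entirely by its when: clause, which compares the role's package list against the gathered package facts. A minimal sketch of a task gated the same way, using only the names that appear in the log (network_packages, ansible_facts.packages); the actual task body in the role may differ:

- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"
    state: present
  when: not network_packages is subset(ansible_facts.packages.keys())

Because every required package is already present in the package facts, the condition evaluates to False and the task is skipped, with false_condition echoed back in the result shown above.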
15621 1726882592.47213: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 15621 1726882592.47531: in run() - task 0affc7ec-ae25-af1a-5b92-000000000022 15621 1726882592.47928: variable 'ansible_search_path' from source: unknown 15621 1726882592.47932: variable 'ansible_search_path' from source: unknown 15621 1726882592.47935: calling self._execute() 15621 1726882592.47937: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882592.47941: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882592.47944: variable 'omit' from source: magic vars 15621 1726882592.49120: variable 'ansible_distribution_major_version' from source: facts 15621 1726882592.49728: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882592.49732: variable 'network_state' from source: role '' defaults 15621 1726882592.49735: Evaluated conditional (network_state != {}): False 15621 1726882592.49738: when evaluation is False, skipping this task 15621 1726882592.49742: _execute() done 15621 1726882592.49745: dumping result to json 15621 1726882592.49748: done dumping result, returning 15621 1726882592.49752: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affc7ec-ae25-af1a-5b92-000000000022] 15621 1726882592.49756: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000022 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15621 1726882592.50104: no more pending results, returning what we have 15621 1726882592.50109: results queue empty 15621 1726882592.50110: checking for any_errors_fatal 15621 1726882592.50119: done checking for any_errors_fatal 15621 1726882592.50120: checking for max_fail_percentage 15621 1726882592.50124: done checking for max_fail_percentage 15621 1726882592.50126: checking to see if all hosts have failed and the running result is not ok 15621 1726882592.50127: done checking to see if all hosts have failed 15621 1726882592.50128: getting the remaining hosts for this loop 15621 1726882592.50130: done getting the remaining hosts for this loop 15621 1726882592.50134: getting the next task for host managed_node3 15621 1726882592.50142: done getting next task for host managed_node3 15621 1726882592.50146: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 15621 1726882592.50148: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882592.50167: getting variables 15621 1726882592.50169: in VariableManager get_vars() 15621 1726882592.50215: Calling all_inventory to load vars for managed_node3 15621 1726882592.50219: Calling groups_inventory to load vars for managed_node3 15621 1726882592.50524: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882592.50541: Calling all_plugins_play to load vars for managed_node3 15621 1726882592.50545: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882592.50548: Calling groups_plugins_play to load vars for managed_node3 15621 1726882592.51530: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000022 15621 1726882592.51535: WORKER PROCESS EXITING 15621 1726882592.54327: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882592.59780: done with get_vars() 15621 1726882592.59807: done getting variables 15621 1726882592.60180: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:36:32 -0400 (0:00:00.146) 0:00:24.681 ****** 15621 1726882592.60213: entering _queue_task() for managed_node3/package 15621 1726882592.61173: worker is 1 (out of 1 available) 15621 1726882592.61187: exiting _queue_task() for managed_node3/package 15621 1726882592.61219: done queuing things up, now waiting for results queue to drain 15621 1726882592.61223: waiting for pending results... 
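Both package tasks gated on network_state (main.yml:85, skipped above, and main.yml:96, just queued) follow the same pattern: they run only when the caller drives the role through a declarative network_state, which is empty in this run. A sketch of such a task under that assumption; the package list is illustrative only:

- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager   # assumed package name for illustration
      - nmstate          # assumed package name for illustration
    state: present
  when:
    - ansible_distribution_major_version != '6'
    - network_state != {}

With network_state equal to {}, the second condition is False, so both tasks are skipped exactly as the surrounding entries show.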
15621 1726882592.61584: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 15621 1726882592.61814: in run() - task 0affc7ec-ae25-af1a-5b92-000000000023 15621 1726882592.61839: variable 'ansible_search_path' from source: unknown 15621 1726882592.61958: variable 'ansible_search_path' from source: unknown 15621 1726882592.61986: calling self._execute() 15621 1726882592.62257: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882592.62261: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882592.62264: variable 'omit' from source: magic vars 15621 1726882592.63069: variable 'ansible_distribution_major_version' from source: facts 15621 1726882592.63429: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882592.63481: variable 'network_state' from source: role '' defaults 15621 1726882592.63498: Evaluated conditional (network_state != {}): False 15621 1726882592.63509: when evaluation is False, skipping this task 15621 1726882592.63517: _execute() done 15621 1726882592.63529: dumping result to json 15621 1726882592.63543: done dumping result, returning 15621 1726882592.63554: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affc7ec-ae25-af1a-5b92-000000000023] 15621 1726882592.63564: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000023 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15621 1726882592.63741: no more pending results, returning what we have 15621 1726882592.63745: results queue empty 15621 1726882592.63747: checking for any_errors_fatal 15621 1726882592.63757: done checking for any_errors_fatal 15621 1726882592.63758: checking for max_fail_percentage 15621 1726882592.63759: done checking for max_fail_percentage 15621 1726882592.63761: checking to see if all hosts have failed and the running result is not ok 15621 1726882592.63762: done checking to see if all hosts have failed 15621 1726882592.63763: getting the remaining hosts for this loop 15621 1726882592.63765: done getting the remaining hosts for this loop 15621 1726882592.63770: getting the next task for host managed_node3 15621 1726882592.63776: done getting next task for host managed_node3 15621 1726882592.63780: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 15621 1726882592.63782: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882592.63800: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000023 15621 1726882592.63803: WORKER PROCESS EXITING 15621 1726882592.63928: getting variables 15621 1726882592.63931: in VariableManager get_vars() 15621 1726882592.63975: Calling all_inventory to load vars for managed_node3 15621 1726882592.63979: Calling groups_inventory to load vars for managed_node3 15621 1726882592.63981: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882592.63996: Calling all_plugins_play to load vars for managed_node3 15621 1726882592.63999: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882592.64003: Calling groups_plugins_play to load vars for managed_node3 15621 1726882592.68003: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882592.73350: done with get_vars() 15621 1726882592.73391: done getting variables 15621 1726882592.73674: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:36:32 -0400 (0:00:00.134) 0:00:24.816 ****** 15621 1726882592.73708: entering _queue_task() for managed_node3/service 15621 1726882592.73710: Creating lock for service 15621 1726882592.74421: worker is 1 (out of 1 available) 15621 1726882592.74538: exiting _queue_task() for managed_node3/service 15621 1726882592.74665: done queuing things up, now waiting for results queue to drain 15621 1726882592.74667: waiting for pending results... 
15621 1726882592.75439: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 15621 1726882592.75445: in run() - task 0affc7ec-ae25-af1a-5b92-000000000024 15621 1726882592.75449: variable 'ansible_search_path' from source: unknown 15621 1726882592.75453: variable 'ansible_search_path' from source: unknown 15621 1726882592.76027: calling self._execute() 15621 1726882592.76031: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882592.76035: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882592.76037: variable 'omit' from source: magic vars 15621 1726882592.77167: variable 'ansible_distribution_major_version' from source: facts 15621 1726882592.77342: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882592.77654: variable '__network_wireless_connections_defined' from source: role '' defaults 15621 1726882592.78528: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15621 1726882592.85389: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15621 1726882592.86930: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15621 1726882592.86935: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15621 1726882592.87328: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15621 1726882592.87332: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15621 1726882592.87335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882592.87338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882592.87341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882592.87773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882592.87797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882592.87854: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882592.88054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882592.88083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 15621 1726882592.88527: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882592.88530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882592.88532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882592.88535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882592.88537: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882592.88539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882592.89127: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882592.89130: variable 'network_connections' from source: play vars 15621 1726882592.89339: variable 'interface' from source: set_fact 15621 1726882592.89607: variable 'interface' from source: set_fact 15621 1726882592.89626: variable 'interface' from source: set_fact 15621 1726882592.89893: variable 'interface' from source: set_fact 15621 1726882592.89976: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15621 1726882592.90598: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15621 1726882592.90868: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15621 1726882592.90909: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15621 1726882592.91327: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15621 1726882592.91331: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15621 1726882592.91333: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15621 1726882592.91335: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882592.91338: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15621 1726882592.91751: variable '__network_team_connections_defined' from source: role '' defaults 
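The two flags resolved here gate the "Restart NetworkManager due to wireless or team interfaces" task (main.yml:109), for which the service action plugin was loaded above. A sketch of a restart task gated this way, with the service name NetworkManager assumed from the task title rather than taken from the role:

- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager   # assumed; the role may use a variable here
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined

The connection profiles in network_connections define neither a wireless nor a team interface, so the disjunction evaluates to False and the restart is skipped in the entries that follow.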
15621 1726882592.92429: variable 'network_connections' from source: play vars 15621 1726882592.92441: variable 'interface' from source: set_fact 15621 1726882592.92692: variable 'interface' from source: set_fact 15621 1726882592.93127: variable 'interface' from source: set_fact 15621 1726882592.93130: variable 'interface' from source: set_fact 15621 1726882592.93133: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15621 1726882592.93135: when evaluation is False, skipping this task 15621 1726882592.93137: _execute() done 15621 1726882592.93139: dumping result to json 15621 1726882592.93142: done dumping result, returning 15621 1726882592.93147: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affc7ec-ae25-af1a-5b92-000000000024] 15621 1726882592.93158: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000024 15621 1726882592.93242: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000024 15621 1726882592.93245: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15621 1726882592.93307: no more pending results, returning what we have 15621 1726882592.93313: results queue empty 15621 1726882592.93315: checking for any_errors_fatal 15621 1726882592.93325: done checking for any_errors_fatal 15621 1726882592.93326: checking for max_fail_percentage 15621 1726882592.93328: done checking for max_fail_percentage 15621 1726882592.93329: checking to see if all hosts have failed and the running result is not ok 15621 1726882592.93331: done checking to see if all hosts have failed 15621 1726882592.93331: getting the remaining hosts for this loop 15621 1726882592.93333: done getting the remaining hosts for this loop 15621 1726882592.93337: getting the next task for host managed_node3 15621 1726882592.93345: done getting next task for host managed_node3 15621 1726882592.93349: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 15621 1726882592.93351: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882592.93490: getting variables 15621 1726882592.93493: in VariableManager get_vars() 15621 1726882592.93538: Calling all_inventory to load vars for managed_node3 15621 1726882592.93541: Calling groups_inventory to load vars for managed_node3 15621 1726882592.93543: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882592.93555: Calling all_plugins_play to load vars for managed_node3 15621 1726882592.93558: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882592.93560: Calling groups_plugins_play to load vars for managed_node3 15621 1726882592.97566: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882593.02626: done with get_vars() 15621 1726882593.02660: done getting variables 15621 1726882593.02940: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:36:33 -0400 (0:00:00.292) 0:00:25.108 ****** 15621 1726882593.02977: entering _queue_task() for managed_node3/service 15621 1726882593.03747: worker is 1 (out of 1 available) 15621 1726882593.03761: exiting _queue_task() for managed_node3/service 15621 1726882593.03776: done queuing things up, now waiting for results queue to drain 15621 1726882593.03778: waiting for pending results... 15621 1726882593.04524: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 15621 1726882593.05231: in run() - task 0affc7ec-ae25-af1a-5b92-000000000025 15621 1726882593.05236: variable 'ansible_search_path' from source: unknown 15621 1726882593.05239: variable 'ansible_search_path' from source: unknown 15621 1726882593.05242: calling self._execute() 15621 1726882593.05246: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882593.05249: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882593.05253: variable 'omit' from source: magic vars 15621 1726882593.06425: variable 'ansible_distribution_major_version' from source: facts 15621 1726882593.07027: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882593.07030: variable 'network_provider' from source: set_fact 15621 1726882593.07033: variable 'network_state' from source: role '' defaults 15621 1726882593.07628: Evaluated conditional (network_provider == "nm" or network_state != {}): True 15621 1726882593.07633: variable 'omit' from source: magic vars 15621 1726882593.07635: variable 'omit' from source: magic vars 15621 1726882593.07638: variable 'network_service_name' from source: role '' defaults 15621 1726882593.07640: variable 'network_service_name' from source: role '' defaults 15621 1726882593.08096: variable '__network_provider_setup' from source: role '' defaults 15621 1726882593.08106: variable '__network_service_name_default_nm' from source: role '' defaults 15621 1726882593.08174: variable '__network_service_name_default_nm' from source: role '' defaults 15621 1726882593.08438: variable '__network_packages_default_nm' from source: role '' defaults 
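Unlike the previous tasks, "Enable and start NetworkManager" (main.yml:122) passes its gate because network_provider is "nm", so the service module will actually execute on the target. Given the network_service_name variable resolved just above, a plausible sketch of the task, not the role's literal source, is:

- name: Enable and start NetworkManager
  ansible.builtin.service:
    name: "{{ network_service_name }}"   # resolves to the NetworkManager unit for the nm provider
    state: started
    enabled: true
  when:
    - ansible_distribution_major_version != '6'
    - network_provider == "nm" or network_state != {}

What follows in the log is the normal execution path for a service task: the existing SSH ControlMaster socket is reused, a temporary directory is created under ~/.ansible/tmp on the target, the AnsiballZ-wrapped systemd module is transferred and made executable, it is run with the remote python3.12 interpreter, and its JSON result (the systemd properties of the NetworkManager unit) is streamed back on stdout.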
15621 1726882593.08507: variable '__network_packages_default_nm' from source: role '' defaults 15621 1726882593.09355: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15621 1726882593.15969: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15621 1726882593.16259: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15621 1726882593.16300: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15621 1726882593.16338: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15621 1726882593.16365: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15621 1726882593.16658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882593.16692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882593.16719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882593.16871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882593.16889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882593.17091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882593.17115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882593.17143: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882593.17189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882593.17205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882593.17873: variable '__network_packages_default_gobject_packages' from source: role '' defaults 15621 1726882593.18326: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882593.18330: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882593.18354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882593.18647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882593.18662: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882593.18766: variable 'ansible_python' from source: facts 15621 1726882593.18793: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 15621 1726882593.18893: variable '__network_wpa_supplicant_required' from source: role '' defaults 15621 1726882593.19113: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15621 1726882593.19435: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882593.19460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882593.19490: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882593.19650: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882593.19759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882593.19835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882593.19859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882593.19977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882593.20038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882593.20058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882593.20348: variable 'network_connections' from 
source: play vars 15621 1726882593.20356: variable 'interface' from source: set_fact 15621 1726882593.20629: variable 'interface' from source: set_fact 15621 1726882593.20633: variable 'interface' from source: set_fact 15621 1726882593.20662: variable 'interface' from source: set_fact 15621 1726882593.20899: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15621 1726882593.21472: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15621 1726882593.21527: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15621 1726882593.21734: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15621 1726882593.21780: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15621 1726882593.22081: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15621 1726882593.22084: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15621 1726882593.22144: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882593.22182: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15621 1726882593.22408: variable '__network_wireless_connections_defined' from source: role '' defaults 15621 1726882593.22783: variable 'network_connections' from source: play vars 15621 1726882593.22789: variable 'interface' from source: set_fact 15621 1726882593.23076: variable 'interface' from source: set_fact 15621 1726882593.23088: variable 'interface' from source: set_fact 15621 1726882593.23327: variable 'interface' from source: set_fact 15621 1726882593.23330: variable '__network_packages_default_wireless' from source: role '' defaults 15621 1726882593.23555: variable '__network_wireless_connections_defined' from source: role '' defaults 15621 1726882593.24258: variable 'network_connections' from source: play vars 15621 1726882593.24261: variable 'interface' from source: set_fact 15621 1726882593.24439: variable 'interface' from source: set_fact 15621 1726882593.24445: variable 'interface' from source: set_fact 15621 1726882593.24616: variable 'interface' from source: set_fact 15621 1726882593.24644: variable '__network_packages_default_team' from source: role '' defaults 15621 1726882593.25029: variable '__network_team_connections_defined' from source: role '' defaults 15621 1726882593.25517: variable 'network_connections' from source: play vars 15621 1726882593.25520: variable 'interface' from source: set_fact 15621 1726882593.26181: variable 'interface' from source: set_fact 15621 1726882593.26187: variable 'interface' from source: set_fact 15621 1726882593.26464: variable 'interface' from source: set_fact 15621 1726882593.26535: variable '__network_service_name_default_initscripts' from source: role '' defaults 15621 1726882593.26600: variable '__network_service_name_default_initscripts' from source: role '' defaults 15621 
1726882593.26607: variable '__network_packages_default_initscripts' from source: role '' defaults 15621 1726882593.26991: variable '__network_packages_default_initscripts' from source: role '' defaults 15621 1726882593.27436: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 15621 1726882593.28595: variable 'network_connections' from source: play vars 15621 1726882593.28599: variable 'interface' from source: set_fact 15621 1726882593.28809: variable 'interface' from source: set_fact 15621 1726882593.28816: variable 'interface' from source: set_fact 15621 1726882593.28912: variable 'interface' from source: set_fact 15621 1726882593.28923: variable 'ansible_distribution' from source: facts 15621 1726882593.28927: variable '__network_rh_distros' from source: role '' defaults 15621 1726882593.28934: variable 'ansible_distribution_major_version' from source: facts 15621 1726882593.29172: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 15621 1726882593.29463: variable 'ansible_distribution' from source: facts 15621 1726882593.29466: variable '__network_rh_distros' from source: role '' defaults 15621 1726882593.29472: variable 'ansible_distribution_major_version' from source: facts 15621 1726882593.29482: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 15621 1726882593.29900: variable 'ansible_distribution' from source: facts 15621 1726882593.29903: variable '__network_rh_distros' from source: role '' defaults 15621 1726882593.29909: variable 'ansible_distribution_major_version' from source: facts 15621 1726882593.30148: variable 'network_provider' from source: set_fact 15621 1726882593.30152: variable 'omit' from source: magic vars 15621 1726882593.30154: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882593.30157: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882593.30281: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882593.30299: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882593.30310: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882593.30497: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882593.30500: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882593.30503: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882593.30861: Set connection var ansible_connection to ssh 15621 1726882593.30871: Set connection var ansible_shell_executable to /bin/sh 15621 1726882593.30882: Set connection var ansible_timeout to 10 15621 1726882593.30885: Set connection var ansible_shell_type to sh 15621 1726882593.30891: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882593.30897: Set connection var ansible_pipelining to False 15621 1726882593.30926: variable 'ansible_shell_executable' from source: unknown 15621 1726882593.30934: variable 'ansible_connection' from source: unknown 15621 1726882593.30937: variable 'ansible_module_compression' from source: unknown 15621 1726882593.30940: variable 'ansible_shell_type' from source: unknown 15621 1726882593.30943: variable 
'ansible_shell_executable' from source: unknown 15621 1726882593.30945: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882593.30952: variable 'ansible_pipelining' from source: unknown 15621 1726882593.30955: variable 'ansible_timeout' from source: unknown 15621 1726882593.30957: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882593.31071: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15621 1726882593.31085: variable 'omit' from source: magic vars 15621 1726882593.31091: starting attempt loop 15621 1726882593.31094: running the handler 15621 1726882593.31310: variable 'ansible_facts' from source: unknown 15621 1726882593.33728: _low_level_execute_command(): starting 15621 1726882593.33732: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15621 1726882593.35017: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882593.35309: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882593.35314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882593.35317: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882593.35391: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882593.35414: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882593.35528: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882593.37308: stdout chunk (state=3): >>>/root <<< 15621 1726882593.37436: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882593.37501: stderr chunk (state=3): >>><<< 15621 1726882593.37504: stdout chunk (state=3): >>><<< 15621 1726882593.37650: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882593.37663: _low_level_execute_command(): starting 15621 1726882593.37670: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882593.3764896-16559-52205225226227 `" && echo ansible-tmp-1726882593.3764896-16559-52205225226227="` echo /root/.ansible/tmp/ansible-tmp-1726882593.3764896-16559-52205225226227 `" ) && sleep 0' 15621 1726882593.38805: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882593.39084: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882593.39088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882593.39091: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882593.39093: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 15621 1726882593.39164: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882593.39413: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882593.39417: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882593.39441: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882593.41521: stdout chunk (state=3): >>>ansible-tmp-1726882593.3764896-16559-52205225226227=/root/.ansible/tmp/ansible-tmp-1726882593.3764896-16559-52205225226227 <<< 15621 1726882593.41633: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882593.41799: stderr chunk (state=3): >>><<< 15621 1726882593.41807: stdout chunk (state=3): >>><<< 15621 1726882593.42030: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882593.3764896-16559-52205225226227=/root/.ansible/tmp/ansible-tmp-1726882593.3764896-16559-52205225226227 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882593.42033: variable 'ansible_module_compression' from source: unknown 15621 1726882593.42180: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 15621 1726882593.42185: ANSIBALLZ: Acquiring lock 15621 1726882593.42188: ANSIBALLZ: Lock acquired: 140146888266560 15621 1726882593.42194: ANSIBALLZ: Creating module 15621 1726882594.17226: ANSIBALLZ: Writing module into payload 15621 1726882594.17844: ANSIBALLZ: Writing module 15621 1726882594.17882: ANSIBALLZ: Renaming module 15621 1726882594.17888: ANSIBALLZ: Done creating module 15621 1726882594.17913: variable 'ansible_facts' from source: unknown 15621 1726882594.18334: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882593.3764896-16559-52205225226227/AnsiballZ_systemd.py 15621 1726882594.18666: Sending initial data 15621 1726882594.18670: Sent initial data (155 bytes) 15621 1726882594.20232: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882594.20358: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882594.20450: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882594.22207: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" 
revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15621 1726882594.22283: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15621 1726882594.22393: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmp58691fua /root/.ansible/tmp/ansible-tmp-1726882593.3764896-16559-52205225226227/AnsiballZ_systemd.py <<< 15621 1726882594.22401: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882593.3764896-16559-52205225226227/AnsiballZ_systemd.py" <<< 15621 1726882594.22469: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmp58691fua" to remote "/root/.ansible/tmp/ansible-tmp-1726882593.3764896-16559-52205225226227/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882593.3764896-16559-52205225226227/AnsiballZ_systemd.py" <<< 15621 1726882594.25652: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882594.25656: stdout chunk (state=3): >>><<< 15621 1726882594.25658: stderr chunk (state=3): >>><<< 15621 1726882594.25661: done transferring module to remote 15621 1726882594.25730: _low_level_execute_command(): starting 15621 1726882594.25734: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882593.3764896-16559-52205225226227/ /root/.ansible/tmp/ansible-tmp-1726882593.3764896-16559-52205225226227/AnsiballZ_systemd.py && sleep 0' 15621 1726882594.27244: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882594.27248: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 15621 1726882594.27257: stderr chunk (state=3): >>>debug2: match found <<< 15621 1726882594.27267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882594.27405: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882594.27416: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882594.27437: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882594.27756: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882594.29830: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882594.29834: stderr chunk (state=3): >>><<< 15621 1726882594.29836: stdout chunk (state=3): 
>>><<< 15621 1726882594.29839: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882594.29841: _low_level_execute_command(): starting 15621 1726882594.29844: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882593.3764896-16559-52205225226227/AnsiballZ_systemd.py && sleep 0' 15621 1726882594.31142: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882594.31155: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882594.31166: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882594.31185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882594.31197: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 15621 1726882594.31205: stderr chunk (state=3): >>>debug2: match not found <<< 15621 1726882594.31218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882594.31239: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15621 1726882594.31456: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882594.31460: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882594.31465: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882594.31487: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882594.31659: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882594.63408: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": 
"normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "685", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:25:00 EDT", "ExecMainStartTimestampMonotonic": "45437073", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "685", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3550", "MemoryCurrent": "11825152", "MemoryPeak": "13709312", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3536642048", "CPUUsageNSec": "1833835000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", 
"ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target shutdown.target", "After": "systemd-journald.socket system.slice basic.target dbus-broker.service dbus.socket network-pre.target cloud-init-local.service sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:22 EDT", "StateChangeTimestampMonotonic": "486988773", "InactiveExitTimestamp": "Fri 2024-09-20 21:25:00 EDT", "InactiveExitTimestampMonotonic": "45437210", "ActiveEnterTimestamp": "Fri 2024-09-20 21:25:02 EDT", "ActiveEnterTimestampMonotonic": "47371748", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:25:00 EDT", "ConditionTimestampMonotonic": "45429688", "AssertTimestamp": "Fri 2024-09-20 21:25:00 EDT", "AssertTimestampMonotonic": "45429690", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6a93edddfc3744e5bee117df30fc836d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 15621 1726882594.65198: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882594.65261: stderr chunk (state=3): >>>Shared connection to 10.31.45.226 closed. 
<<< 15621 1726882594.65453: stderr chunk (state=3): >>><<< 15621 1726882594.65456: stdout chunk (state=3): >>><<< 15621 1726882594.65479: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "685", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:25:00 EDT", "ExecMainStartTimestampMonotonic": "45437073", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "685", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3550", "MemoryCurrent": "11825152", "MemoryPeak": "13709312", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3536642048", "CPUUsageNSec": "1833835000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": 
"infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target shutdown.target", "After": "systemd-journald.socket system.slice basic.target dbus-broker.service dbus.socket network-pre.target cloud-init-local.service sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:22 EDT", "StateChangeTimestampMonotonic": "486988773", "InactiveExitTimestamp": "Fri 2024-09-20 21:25:00 EDT", "InactiveExitTimestampMonotonic": "45437210", "ActiveEnterTimestamp": "Fri 2024-09-20 21:25:02 EDT", "ActiveEnterTimestampMonotonic": "47371748", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:25:00 EDT", "ConditionTimestampMonotonic": "45429688", "AssertTimestamp": "Fri 2024-09-20 21:25:00 EDT", "AssertTimestampMonotonic": "45429690", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6a93edddfc3744e5bee117df30fc836d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 
originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 15621 1726882594.66328: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882593.3764896-16559-52205225226227/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15621 1726882594.66332: _low_level_execute_command(): starting 15621 1726882594.66335: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882593.3764896-16559-52205225226227/ > /dev/null 2>&1 && sleep 0' 15621 1726882594.67298: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882594.67336: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882594.67385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882594.67464: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882594.67697: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882594.67706: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882594.69644: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 <<< 15621 1726882594.69689: stderr chunk (state=3): >>><<< 15621 1726882594.69699: stdout chunk (state=3): >>><<< 15621 1726882594.70093: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882594.70097: handler run complete 15621 1726882594.70100: attempt loop complete, returning result 15621 1726882594.70102: _execute() done 15621 1726882594.70104: dumping result to json 15621 1726882594.70106: done dumping result, returning 15621 1726882594.70109: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affc7ec-ae25-af1a-5b92-000000000025] 15621 1726882594.70211: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000025 ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15621 1726882594.70813: no more pending results, returning what we have 15621 1726882594.70817: results queue empty 15621 1726882594.70818: checking for any_errors_fatal 15621 1726882594.70832: done checking for any_errors_fatal 15621 1726882594.70833: checking for max_fail_percentage 15621 1726882594.70835: done checking for max_fail_percentage 15621 1726882594.70835: checking to see if all hosts have failed and the running result is not ok 15621 1726882594.70837: done checking to see if all hosts have failed 15621 1726882594.70838: getting the remaining hosts for this loop 15621 1726882594.70839: done getting the remaining hosts for this loop 15621 1726882594.70845: getting the next task for host managed_node3 15621 1726882594.70852: done getting next task for host managed_node3 15621 1726882594.70857: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 15621 1726882594.70859: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882594.70870: getting variables 15621 1726882594.70875: in VariableManager get_vars() 15621 1726882594.70913: Calling all_inventory to load vars for managed_node3 15621 1726882594.70916: Calling groups_inventory to load vars for managed_node3 15621 1726882594.70918: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882594.70934: Calling all_plugins_play to load vars for managed_node3 15621 1726882594.70936: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882594.70939: Calling groups_plugins_play to load vars for managed_node3 15621 1726882594.72130: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000025 15621 1726882594.72134: WORKER PROCESS EXITING 15621 1726882594.74866: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882594.79416: done with get_vars() 15621 1726882594.79558: done getting variables 15621 1726882594.79747: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:36:34 -0400 (0:00:01.768) 0:00:26.877 ****** 15621 1726882594.79784: entering _queue_task() for managed_node3/service 15621 1726882594.80674: worker is 1 (out of 1 available) 15621 1726882594.80686: exiting _queue_task() for managed_node3/service 15621 1726882594.80700: done queuing things up, now waiting for results queue to drain 15621 1726882594.80702: waiting for pending results... 
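For reference, the ansible.legacy.systemd invocation recorded just above (module args name=NetworkManager, state=started, enabled=true, executed with no_log so the result is censored) corresponds to a role task shaped roughly like the sketch below. This YAML is reconstructed from the logged module_args for illustration only; the actual task in fedora.linux_system_roles.network may differ in layout and in how its values are templated.

    # Sketch reconstructed from the logged module_args; illustrative only,
    # not the literal task from roles/network/tasks/main.yml.
    - name: Enable and start NetworkManager
      ansible.builtin.systemd:
        name: NetworkManager
        state: started
        enabled: true
      no_log: true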
15621 1726882594.81167: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 15621 1726882594.81942: in run() - task 0affc7ec-ae25-af1a-5b92-000000000026 15621 1726882594.81946: variable 'ansible_search_path' from source: unknown 15621 1726882594.81949: variable 'ansible_search_path' from source: unknown 15621 1726882594.81952: calling self._execute() 15621 1726882594.82121: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882594.82376: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882594.82380: variable 'omit' from source: magic vars 15621 1726882594.82911: variable 'ansible_distribution_major_version' from source: facts 15621 1726882594.82937: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882594.83066: variable 'network_provider' from source: set_fact 15621 1726882594.83081: Evaluated conditional (network_provider == "nm"): True 15621 1726882594.83182: variable '__network_wpa_supplicant_required' from source: role '' defaults 15621 1726882594.83287: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15621 1726882594.83501: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15621 1726882594.87236: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15621 1726882594.87430: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15621 1726882594.87527: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15621 1726882594.87574: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15621 1726882594.87829: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15621 1726882594.87955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882594.88046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882594.88285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882594.88289: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882594.88292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882594.88495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882594.88499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 15621 1726882594.88503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882594.88631: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882594.88653: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882594.88823: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882594.88861: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882594.88897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882594.88980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882594.89063: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882594.89339: variable 'network_connections' from source: play vars 15621 1726882594.89444: variable 'interface' from source: set_fact 15621 1726882594.89805: variable 'interface' from source: set_fact 15621 1726882594.89810: variable 'interface' from source: set_fact 15621 1726882594.89813: variable 'interface' from source: set_fact 15621 1726882594.90008: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15621 1726882594.90502: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15621 1726882594.90548: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15621 1726882594.90661: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15621 1726882594.90893: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15621 1726882594.90896: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15621 1726882594.90900: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15621 1726882594.91128: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882594.91134: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15621 1726882594.91137: variable '__network_wireless_connections_defined' from source: role '' defaults 15621 1726882594.91820: variable 'network_connections' from source: play vars 15621 1726882594.91834: variable 'interface' from source: set_fact 15621 1726882594.92100: variable 'interface' from source: set_fact 15621 1726882594.92104: variable 'interface' from source: set_fact 15621 1726882594.92154: variable 'interface' from source: set_fact 15621 1726882594.92268: Evaluated conditional (__network_wpa_supplicant_required): False 15621 1726882594.92326: when evaluation is False, skipping this task 15621 1726882594.92337: _execute() done 15621 1726882594.92541: dumping result to json 15621 1726882594.92546: done dumping result, returning 15621 1726882594.92549: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affc7ec-ae25-af1a-5b92-000000000026] 15621 1726882594.92551: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000026 15621 1726882594.92838: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000026 15621 1726882594.92842: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 15621 1726882594.92896: no more pending results, returning what we have 15621 1726882594.92900: results queue empty 15621 1726882594.92902: checking for any_errors_fatal 15621 1726882594.92935: done checking for any_errors_fatal 15621 1726882594.92936: checking for max_fail_percentage 15621 1726882594.92938: done checking for max_fail_percentage 15621 1726882594.92939: checking to see if all hosts have failed and the running result is not ok 15621 1726882594.92944: done checking to see if all hosts have failed 15621 1726882594.92945: getting the remaining hosts for this loop 15621 1726882594.92947: done getting the remaining hosts for this loop 15621 1726882594.92951: getting the next task for host managed_node3 15621 1726882594.92958: done getting next task for host managed_node3 15621 1726882594.92962: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 15621 1726882594.92964: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882594.92983: getting variables 15621 1726882594.92985: in VariableManager get_vars() 15621 1726882594.93239: Calling all_inventory to load vars for managed_node3 15621 1726882594.93242: Calling groups_inventory to load vars for managed_node3 15621 1726882594.93245: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882594.93256: Calling all_plugins_play to load vars for managed_node3 15621 1726882594.93260: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882594.93263: Calling groups_plugins_play to load vars for managed_node3 15621 1726882594.96888: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882595.00500: done with get_vars() 15621 1726882595.00544: done getting variables 15621 1726882595.00608: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:36:35 -0400 (0:00:00.208) 0:00:27.085 ****** 15621 1726882595.00644: entering _queue_task() for managed_node3/service 15621 1726882595.01001: worker is 1 (out of 1 available) 15621 1726882595.01014: exiting _queue_task() for managed_node3/service 15621 1726882595.01031: done queuing things up, now waiting for results queue to drain 15621 1726882595.01033: waiting for pending results... 15621 1726882595.01406: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 15621 1726882595.01465: in run() - task 0affc7ec-ae25-af1a-5b92-000000000027 15621 1726882595.01489: variable 'ansible_search_path' from source: unknown 15621 1726882595.01493: variable 'ansible_search_path' from source: unknown 15621 1726882595.01613: calling self._execute() 15621 1726882595.01692: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882595.01696: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882595.01818: variable 'omit' from source: magic vars 15621 1726882595.02502: variable 'ansible_distribution_major_version' from source: facts 15621 1726882595.02516: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882595.02814: variable 'network_provider' from source: set_fact 15621 1726882595.02821: Evaluated conditional (network_provider == "initscripts"): False 15621 1726882595.02826: when evaluation is False, skipping this task 15621 1726882595.02830: _execute() done 15621 1726882595.02833: dumping result to json 15621 1726882595.02838: done dumping result, returning 15621 1726882595.02847: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0affc7ec-ae25-af1a-5b92-000000000027] 15621 1726882595.02853: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000027 15621 1726882595.02966: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000027 15621 1726882595.02969: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15621 
1726882595.03068: no more pending results, returning what we have 15621 1726882595.03072: results queue empty 15621 1726882595.03073: checking for any_errors_fatal 15621 1726882595.03084: done checking for any_errors_fatal 15621 1726882595.03085: checking for max_fail_percentage 15621 1726882595.03087: done checking for max_fail_percentage 15621 1726882595.03088: checking to see if all hosts have failed and the running result is not ok 15621 1726882595.03089: done checking to see if all hosts have failed 15621 1726882595.03090: getting the remaining hosts for this loop 15621 1726882595.03091: done getting the remaining hosts for this loop 15621 1726882595.03096: getting the next task for host managed_node3 15621 1726882595.03103: done getting next task for host managed_node3 15621 1726882595.03106: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 15621 1726882595.03109: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15621 1726882595.03128: getting variables 15621 1726882595.03131: in VariableManager get_vars() 15621 1726882595.03171: Calling all_inventory to load vars for managed_node3 15621 1726882595.03175: Calling groups_inventory to load vars for managed_node3 15621 1726882595.03177: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882595.03192: Calling all_plugins_play to load vars for managed_node3 15621 1726882595.03194: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882595.03197: Calling groups_plugins_play to load vars for managed_node3 15621 1726882595.07566: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882595.09880: done with get_vars() 15621 1726882595.10100: done getting variables 15621 1726882595.10315: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:36:35 -0400 (0:00:00.098) 0:00:27.183 ****** 15621 1726882595.10458: entering _queue_task() for managed_node3/copy 15621 1726882595.11649: worker is 1 (out of 1 available) 15621 1726882595.11662: exiting _queue_task() for managed_node3/copy 15621 1726882595.11801: done queuing things up, now waiting for results queue to drain 15621 1726882595.11804: waiting for pending results... 
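Both skips in this stretch of the run come from the same provider guard: with network_provider resolved to "nm", any task conditioned on the initscripts provider is skipped before a connection is opened, which is why the log shows only "Conditional result was False" and no SSH traffic. A minimal sketch of such a guarded task follows; the module choice and service name are assumptions for illustration, since the log records only the task name and the false_condition network_provider == "initscripts".

    # Illustrative only: a task skipped under the nm provider, mirroring the
    # "Enable network service" skip above. The service name here is assumed.
    - name: Enable network service
      ansible.builtin.service:
        name: network
        enabled: true
      when: network_provider == "initscripts"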
15621 1726882595.12597: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 15621 1726882595.12602: in run() - task 0affc7ec-ae25-af1a-5b92-000000000028 15621 1726882595.12606: variable 'ansible_search_path' from source: unknown 15621 1726882595.12609: variable 'ansible_search_path' from source: unknown 15621 1726882595.12612: calling self._execute() 15621 1726882595.12615: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882595.12618: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882595.12620: variable 'omit' from source: magic vars 15621 1726882595.13392: variable 'ansible_distribution_major_version' from source: facts 15621 1726882595.13404: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882595.13569: variable 'network_provider' from source: set_fact 15621 1726882595.13575: Evaluated conditional (network_provider == "initscripts"): False 15621 1726882595.13579: when evaluation is False, skipping this task 15621 1726882595.13748: _execute() done 15621 1726882595.13752: dumping result to json 15621 1726882595.13754: done dumping result, returning 15621 1726882595.13760: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affc7ec-ae25-af1a-5b92-000000000028] 15621 1726882595.13763: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000028 15621 1726882595.13841: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000028 15621 1726882595.13845: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 15621 1726882595.13894: no more pending results, returning what we have 15621 1726882595.13898: results queue empty 15621 1726882595.13899: checking for any_errors_fatal 15621 1726882595.13905: done checking for any_errors_fatal 15621 1726882595.13906: checking for max_fail_percentage 15621 1726882595.13908: done checking for max_fail_percentage 15621 1726882595.13915: checking to see if all hosts have failed and the running result is not ok 15621 1726882595.13916: done checking to see if all hosts have failed 15621 1726882595.13917: getting the remaining hosts for this loop 15621 1726882595.13919: done getting the remaining hosts for this loop 15621 1726882595.13925: getting the next task for host managed_node3 15621 1726882595.13932: done getting next task for host managed_node3 15621 1726882595.13936: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 15621 1726882595.13938: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882595.13953: getting variables 15621 1726882595.13955: in VariableManager get_vars() 15621 1726882595.13999: Calling all_inventory to load vars for managed_node3 15621 1726882595.14002: Calling groups_inventory to load vars for managed_node3 15621 1726882595.14005: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882595.14018: Calling all_plugins_play to load vars for managed_node3 15621 1726882595.14021: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882595.14130: Calling groups_plugins_play to load vars for managed_node3 15621 1726882595.17125: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882595.23108: done with get_vars() 15621 1726882595.23152: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:36:35 -0400 (0:00:00.128) 0:00:27.311 ****** 15621 1726882595.23263: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 15621 1726882595.23265: Creating lock for fedora.linux_system_roles.network_connections 15621 1726882595.23680: worker is 1 (out of 1 available) 15621 1726882595.23695: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 15621 1726882595.23825: done queuing things up, now waiting for results queue to drain 15621 1726882595.23827: waiting for pending results... 15621 1726882595.24188: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 15621 1726882595.24195: in run() - task 0affc7ec-ae25-af1a-5b92-000000000029 15621 1726882595.24199: variable 'ansible_search_path' from source: unknown 15621 1726882595.24202: variable 'ansible_search_path' from source: unknown 15621 1726882595.24330: calling self._execute() 15621 1726882595.24336: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882595.24402: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882595.24415: variable 'omit' from source: magic vars 15621 1726882595.24881: variable 'ansible_distribution_major_version' from source: facts 15621 1726882595.24895: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882595.24902: variable 'omit' from source: magic vars 15621 1726882595.24956: variable 'omit' from source: magic vars 15621 1726882595.25174: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15621 1726882595.31342: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15621 1726882595.31430: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15621 1726882595.31480: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15621 1726882595.31528: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15621 1726882595.31553: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15621 1726882595.31646: variable 'network_provider' from source: set_fact 15621 1726882595.31846: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882595.31872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882595.31901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882595.31955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882595.32028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882595.32052: variable 'omit' from source: magic vars 15621 1726882595.32185: variable 'omit' from source: magic vars 15621 1726882595.32295: variable 'network_connections' from source: play vars 15621 1726882595.32308: variable 'interface' from source: set_fact 15621 1726882595.32390: variable 'interface' from source: set_fact 15621 1726882595.32401: variable 'interface' from source: set_fact 15621 1726882595.32470: variable 'interface' from source: set_fact 15621 1726882595.32655: variable 'omit' from source: magic vars 15621 1726882595.32669: variable '__lsr_ansible_managed' from source: task vars 15621 1726882595.32761: variable '__lsr_ansible_managed' from source: task vars 15621 1726882595.33087: Loaded config def from plugin (lookup/template) 15621 1726882595.33090: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 15621 1726882595.33148: File lookup term: get_ansible_managed.j2 15621 1726882595.33152: variable 'ansible_search_path' from source: unknown 15621 1726882595.33155: evaluation_path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 15621 1726882595.33160: search_path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 15621 1726882595.33230: variable 'ansible_search_path' from source: unknown 15621 1726882595.43186: variable 'ansible_managed' from source: unknown 15621 
1726882595.43518: variable 'omit' from source: magic vars 15621 1726882595.43632: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882595.43647: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882595.43682: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882595.43702: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882595.43738: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882595.43829: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882595.43835: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882595.43837: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882595.43983: Set connection var ansible_connection to ssh 15621 1726882595.44004: Set connection var ansible_shell_executable to /bin/sh 15621 1726882595.44050: Set connection var ansible_timeout to 10 15621 1726882595.44053: Set connection var ansible_shell_type to sh 15621 1726882595.44060: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882595.44063: Set connection var ansible_pipelining to False 15621 1726882595.44066: variable 'ansible_shell_executable' from source: unknown 15621 1726882595.44068: variable 'ansible_connection' from source: unknown 15621 1726882595.44077: variable 'ansible_module_compression' from source: unknown 15621 1726882595.44080: variable 'ansible_shell_type' from source: unknown 15621 1726882595.44084: variable 'ansible_shell_executable' from source: unknown 15621 1726882595.44087: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882595.44128: variable 'ansible_pipelining' from source: unknown 15621 1726882595.44132: variable 'ansible_timeout' from source: unknown 15621 1726882595.44134: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882595.44486: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15621 1726882595.44642: variable 'omit' from source: magic vars 15621 1726882595.44646: starting attempt loop 15621 1726882595.44650: running the handler 15621 1726882595.44652: _low_level_execute_command(): starting 15621 1726882595.44655: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15621 1726882595.45850: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882595.45977: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882595.45992: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882595.46047: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882595.46135: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882595.47883: stdout chunk (state=3): >>>/root <<< 15621 1726882595.47996: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882595.48197: stderr chunk (state=3): >>><<< 15621 1726882595.48201: stdout chunk (state=3): >>><<< 15621 1726882595.48204: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882595.48206: _low_level_execute_command(): starting 15621 1726882595.48209: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882595.4815109-16626-108138958526946 `" && echo ansible-tmp-1726882595.4815109-16626-108138958526946="` echo /root/.ansible/tmp/ansible-tmp-1726882595.4815109-16626-108138958526946 `" ) && sleep 0' 15621 1726882595.48977: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882595.49078: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882595.49101: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882595.49120: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882595.49147: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882595.49261: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882595.51268: stdout chunk (state=3): >>>ansible-tmp-1726882595.4815109-16626-108138958526946=/root/.ansible/tmp/ansible-tmp-1726882595.4815109-16626-108138958526946 <<< 15621 1726882595.51474: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882595.51478: stdout chunk (state=3): >>><<< 15621 1726882595.51480: stderr chunk (state=3): >>><<< 15621 1726882595.51629: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882595.4815109-16626-108138958526946=/root/.ansible/tmp/ansible-tmp-1726882595.4815109-16626-108138958526946 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882595.51633: variable 'ansible_module_compression' from source: unknown 15621 1726882595.51635: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 15621 1726882595.51637: ANSIBALLZ: Acquiring lock 15621 1726882595.51640: ANSIBALLZ: Lock acquired: 140146888386768 15621 1726882595.51650: ANSIBALLZ: Creating module 15621 1726882595.82526: ANSIBALLZ: Writing module into payload 15621 1726882595.83613: ANSIBALLZ: Writing module 15621 1726882595.83712: ANSIBALLZ: Renaming module 15621 1726882595.83718: ANSIBALLZ: Done creating module 15621 1726882595.83747: variable 'ansible_facts' from source: unknown 15621 1726882595.83971: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882595.4815109-16626-108138958526946/AnsiballZ_network_connections.py 15621 1726882595.84430: Sending initial data 15621 1726882595.84435: Sent initial data (168 bytes) 15621 1726882595.85756: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882595.85762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882595.85789: stderr 
chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882595.85792: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882595.85807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found <<< 15621 1726882595.85813: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882595.85938: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882595.86061: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882595.86189: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882595.87904: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 15621 1726882595.87921: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 15621 1726882595.87939: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15621 1726882595.88055: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15621 1726882595.88166: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmp9yiqrnqi /root/.ansible/tmp/ansible-tmp-1726882595.4815109-16626-108138958526946/AnsiballZ_network_connections.py <<< 15621 1726882595.88171: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882595.4815109-16626-108138958526946/AnsiballZ_network_connections.py" <<< 15621 1726882595.88252: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmp9yiqrnqi" to remote "/root/.ansible/tmp/ansible-tmp-1726882595.4815109-16626-108138958526946/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882595.4815109-16626-108138958526946/AnsiballZ_network_connections.py" <<< 15621 1726882595.89928: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882595.89932: stdout chunk (state=3): >>><<< 15621 1726882595.89934: stderr chunk (state=3): >>><<< 15621 1726882595.89937: done transferring module to remote 15621 1726882595.89952: _low_level_execute_command(): starting 15621 1726882595.89966: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882595.4815109-16626-108138958526946/ /root/.ansible/tmp/ansible-tmp-1726882595.4815109-16626-108138958526946/AnsiballZ_network_connections.py && sleep 0' 15621 1726882595.90761: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882595.90806: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882595.90830: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882595.90866: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882595.90987: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882595.93079: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882595.93091: stderr chunk (state=3): >>><<< 15621 1726882595.93098: stdout chunk (state=3): >>><<< 15621 1726882595.93119: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882595.93129: _low_level_execute_command(): starting 15621 1726882595.93138: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882595.4815109-16626-108138958526946/AnsiballZ_network_connections.py && sleep 0' 15621 1726882595.94239: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882595.94289: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882595.94306: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882595.94338: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882595.94495: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882596.49460: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, 8aca4925-38c7-45a9-b2be-84b83d56f24f\n[004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, 8aca4925-38c7-45a9-b2be-84b83d56f24f (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "interface_name": "lsr27", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"address": "192.0.2.1/24"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "interface_name": "lsr27", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"address": "192.0.2.1/24"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 15621 1726882596.51283: stderr 
chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882596.51328: stderr chunk (state=3): >>>Shared connection to 10.31.45.226 closed. <<< 15621 1726882596.51400: stderr chunk (state=3): >>><<< 15621 1726882596.51409: stdout chunk (state=3): >>><<< 15621 1726882596.51447: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, 8aca4925-38c7-45a9-b2be-84b83d56f24f\n[004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, 8aca4925-38c7-45a9-b2be-84b83d56f24f (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "interface_name": "lsr27", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"address": "192.0.2.1/24"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "interface_name": "lsr27", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"address": "192.0.2.1/24"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 
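The module result above shows the exact arguments passed to fedora.linux_system_roles.network_connections: provider "nm" and a single ethernet profile named lsr27 with address 192.0.2.1/24. The sketch below reconstructs play variables that would drive such an invocation; it is inferred from the logged module_args, not copied from the test playbook (which, per the variable trace, templates the interface name through a set_fact variable).

```yaml
# Sketch reconstructed from the module_args in the result above; the real
# test play derives the interface name ("lsr27") from a set_fact variable.
network_provider: nm
network_connections:
  - name: lsr27
    interface_name: lsr27
    type: ethernet
    autoconnect: true
    state: up
    ip:
      address: 192.0.2.1/24
```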
15621 1726882596.51500: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'lsr27', 'interface_name': 'lsr27', 'state': 'up', 'type': 'ethernet', 'autoconnect': True, 'ip': {'address': '192.0.2.1/24'}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882595.4815109-16626-108138958526946/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15621 1726882596.51516: _low_level_execute_command(): starting 15621 1726882596.51540: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882595.4815109-16626-108138958526946/ > /dev/null 2>&1 && sleep 0' 15621 1726882596.52241: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882596.52244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882596.52336: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882596.52350: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882596.52383: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882596.52509: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882596.54627: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882596.54631: stderr chunk (state=3): >>><<< 15621 1726882596.54634: stdout chunk (state=3): >>><<< 15621 1726882596.54636: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882596.54639: handler run complete 15621 1726882596.54645: attempt loop complete, returning result 15621 1726882596.54647: _execute() done 15621 1726882596.54649: dumping result to json 15621 1726882596.54652: done dumping result, returning 15621 1726882596.54654: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affc7ec-ae25-af1a-5b92-000000000029] 15621 1726882596.54656: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000029 changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": true, "interface_name": "lsr27", "ip": { "address": "192.0.2.1/24" }, "name": "lsr27", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, 8aca4925-38c7-45a9-b2be-84b83d56f24f [004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, 8aca4925-38c7-45a9-b2be-84b83d56f24f (not-active) 15621 1726882596.54978: no more pending results, returning what we have 15621 1726882596.54982: results queue empty 15621 1726882596.54983: checking for any_errors_fatal 15621 1726882596.54991: done checking for any_errors_fatal 15621 1726882596.54992: checking for max_fail_percentage 15621 1726882596.54994: done checking for max_fail_percentage 15621 1726882596.54995: checking to see if all hosts have failed and the running result is not ok 15621 1726882596.54996: done checking to see if all hosts have failed 15621 1726882596.54997: getting the remaining hosts for this loop 15621 1726882596.54999: done getting the remaining hosts for this loop 15621 1726882596.55003: getting the next task for host managed_node3 15621 1726882596.55009: done getting next task for host managed_node3 15621 1726882596.55013: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 15621 1726882596.55015: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882596.55025: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000029 15621 1726882596.55028: WORKER PROCESS EXITING 15621 1726882596.55144: getting variables 15621 1726882596.55146: in VariableManager get_vars() 15621 1726882596.55188: Calling all_inventory to load vars for managed_node3 15621 1726882596.55191: Calling groups_inventory to load vars for managed_node3 15621 1726882596.55193: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882596.55205: Calling all_plugins_play to load vars for managed_node3 15621 1726882596.55208: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882596.55212: Calling groups_plugins_play to load vars for managed_node3 15621 1726882596.57480: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882596.59517: done with get_vars() 15621 1726882596.59552: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:36:36 -0400 (0:00:01.363) 0:00:28.675 ****** 15621 1726882596.59645: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 15621 1726882596.59647: Creating lock for fedora.linux_system_roles.network_state 15621 1726882596.60017: worker is 1 (out of 1 available) 15621 1726882596.60234: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 15621 1726882596.60246: done queuing things up, now waiting for results queue to drain 15621 1726882596.60248: waiting for pending results... 15621 1726882596.60341: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 15621 1726882596.60458: in run() - task 0affc7ec-ae25-af1a-5b92-00000000002a 15621 1726882596.60485: variable 'ansible_search_path' from source: unknown 15621 1726882596.60492: variable 'ansible_search_path' from source: unknown 15621 1726882596.60538: calling self._execute() 15621 1726882596.60640: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882596.60652: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882596.60667: variable 'omit' from source: magic vars 15621 1726882596.61121: variable 'ansible_distribution_major_version' from source: facts 15621 1726882596.61127: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882596.61213: variable 'network_state' from source: role '' defaults 15621 1726882596.61235: Evaluated conditional (network_state != {}): False 15621 1726882596.61244: when evaluation is False, skipping this task 15621 1726882596.61253: _execute() done 15621 1726882596.61262: dumping result to json 15621 1726882596.61271: done dumping result, returning 15621 1726882596.61284: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0affc7ec-ae25-af1a-5b92-00000000002a] 15621 1726882596.61296: sending task result for task 0affc7ec-ae25-af1a-5b92-00000000002a skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15621 1726882596.61461: no more pending results, returning what we have 15621 1726882596.61466: results queue empty 15621 1726882596.61467: checking for any_errors_fatal 15621 1726882596.61481: done checking for 
any_errors_fatal 15621 1726882596.61482: checking for max_fail_percentage 15621 1726882596.61484: done checking for max_fail_percentage 15621 1726882596.61485: checking to see if all hosts have failed and the running result is not ok 15621 1726882596.61487: done checking to see if all hosts have failed 15621 1726882596.61487: getting the remaining hosts for this loop 15621 1726882596.61489: done getting the remaining hosts for this loop 15621 1726882596.61494: getting the next task for host managed_node3 15621 1726882596.61501: done getting next task for host managed_node3 15621 1726882596.61505: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 15621 1726882596.61509: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15621 1726882596.61529: getting variables 15621 1726882596.61531: in VariableManager get_vars() 15621 1726882596.61574: Calling all_inventory to load vars for managed_node3 15621 1726882596.61578: Calling groups_inventory to load vars for managed_node3 15621 1726882596.61580: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882596.61597: Calling all_plugins_play to load vars for managed_node3 15621 1726882596.61600: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882596.61603: Calling groups_plugins_play to load vars for managed_node3 15621 1726882596.62439: done sending task result for task 0affc7ec-ae25-af1a-5b92-00000000002a 15621 1726882596.62443: WORKER PROCESS EXITING 15621 1726882596.63628: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882596.65639: done with get_vars() 15621 1726882596.65667: done getting variables 15621 1726882596.65738: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:36:36 -0400 (0:00:00.061) 0:00:28.736 ****** 15621 1726882596.65772: entering _queue_task() for managed_node3/debug 15621 1726882596.66110: worker is 1 (out of 1 available) 15621 1726882596.66326: exiting _queue_task() for managed_node3/debug 15621 1726882596.66337: done queuing things up, now waiting for results queue to drain 15621 1726882596.66338: waiting for pending results... 
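The "Configure networking state" task above is skipped because network_state is still at its role default of {} ("Evaluated conditional (network_state != {}): False"). For reference, a hedged sketch of a non-empty, nmstate-style network_state that would make that conditional true; the interface and address values are illustrative and not taken from this run.

```yaml
# Illustrative only: any non-empty nmstate-style dict satisfies the
# "network_state != {}" conditional logged above. This run leaves it empty.
network_state:
  interfaces:
    - name: lsr27
      type: ethernet
      state: up
      ipv4:
        enabled: true
        dhcp: false
        address:
          - ip: 192.0.2.1
            prefix-length: 24
```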
15621 1726882596.66441: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 15621 1726882596.66560: in run() - task 0affc7ec-ae25-af1a-5b92-00000000002b 15621 1726882596.66583: variable 'ansible_search_path' from source: unknown 15621 1726882596.66590: variable 'ansible_search_path' from source: unknown 15621 1726882596.66635: calling self._execute() 15621 1726882596.66733: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882596.66747: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882596.66762: variable 'omit' from source: magic vars 15621 1726882596.67152: variable 'ansible_distribution_major_version' from source: facts 15621 1726882596.67168: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882596.67179: variable 'omit' from source: magic vars 15621 1726882596.67326: variable 'omit' from source: magic vars 15621 1726882596.67331: variable 'omit' from source: magic vars 15621 1726882596.67335: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882596.67366: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882596.67392: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882596.67417: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882596.67440: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882596.67477: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882596.67487: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882596.67495: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882596.67609: Set connection var ansible_connection to ssh 15621 1726882596.67629: Set connection var ansible_shell_executable to /bin/sh 15621 1726882596.67644: Set connection var ansible_timeout to 10 15621 1726882596.67656: Set connection var ansible_shell_type to sh 15621 1726882596.67667: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882596.67678: Set connection var ansible_pipelining to False 15621 1726882596.67708: variable 'ansible_shell_executable' from source: unknown 15621 1726882596.67717: variable 'ansible_connection' from source: unknown 15621 1726882596.67728: variable 'ansible_module_compression' from source: unknown 15621 1726882596.67759: variable 'ansible_shell_type' from source: unknown 15621 1726882596.67763: variable 'ansible_shell_executable' from source: unknown 15621 1726882596.67765: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882596.67768: variable 'ansible_pipelining' from source: unknown 15621 1726882596.67770: variable 'ansible_timeout' from source: unknown 15621 1726882596.67772: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882596.67977: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15621 
1726882596.67981: variable 'omit' from source: magic vars 15621 1726882596.67984: starting attempt loop 15621 1726882596.67986: running the handler 15621 1726882596.68093: variable '__network_connections_result' from source: set_fact 15621 1726882596.68158: handler run complete 15621 1726882596.68184: attempt loop complete, returning result 15621 1726882596.68196: _execute() done 15621 1726882596.68204: dumping result to json 15621 1726882596.68213: done dumping result, returning 15621 1726882596.68229: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affc7ec-ae25-af1a-5b92-00000000002b] 15621 1726882596.68241: sending task result for task 0affc7ec-ae25-af1a-5b92-00000000002b 15621 1726882596.68367: done sending task result for task 0affc7ec-ae25-af1a-5b92-00000000002b 15621 1726882596.68371: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "[003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, 8aca4925-38c7-45a9-b2be-84b83d56f24f", "[004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, 8aca4925-38c7-45a9-b2be-84b83d56f24f (not-active)" ] } 15621 1726882596.68474: no more pending results, returning what we have 15621 1726882596.68478: results queue empty 15621 1726882596.68479: checking for any_errors_fatal 15621 1726882596.68486: done checking for any_errors_fatal 15621 1726882596.68487: checking for max_fail_percentage 15621 1726882596.68489: done checking for max_fail_percentage 15621 1726882596.68490: checking to see if all hosts have failed and the running result is not ok 15621 1726882596.68492: done checking to see if all hosts have failed 15621 1726882596.68493: getting the remaining hosts for this loop 15621 1726882596.68494: done getting the remaining hosts for this loop 15621 1726882596.68499: getting the next task for host managed_node3 15621 1726882596.68505: done getting next task for host managed_node3 15621 1726882596.68508: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 15621 1726882596.68510: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882596.68523: getting variables 15621 1726882596.68525: in VariableManager get_vars() 15621 1726882596.68561: Calling all_inventory to load vars for managed_node3 15621 1726882596.68564: Calling groups_inventory to load vars for managed_node3 15621 1726882596.68566: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882596.68579: Calling all_plugins_play to load vars for managed_node3 15621 1726882596.68581: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882596.68584: Calling groups_plugins_play to load vars for managed_node3 15621 1726882596.70512: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882596.72551: done with get_vars() 15621 1726882596.72589: done getting variables 15621 1726882596.72657: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:36:36 -0400 (0:00:00.069) 0:00:28.806 ****** 15621 1726882596.72688: entering _queue_task() for managed_node3/debug 15621 1726882596.73044: worker is 1 (out of 1 available) 15621 1726882596.73058: exiting _queue_task() for managed_node3/debug 15621 1726882596.73071: done queuing things up, now waiting for results queue to drain 15621 1726882596.73073: waiting for pending results... 15621 1726882596.73453: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 15621 1726882596.73478: in run() - task 0affc7ec-ae25-af1a-5b92-00000000002c 15621 1726882596.73502: variable 'ansible_search_path' from source: unknown 15621 1726882596.73511: variable 'ansible_search_path' from source: unknown 15621 1726882596.73728: calling self._execute() 15621 1726882596.73733: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882596.73736: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882596.73739: variable 'omit' from source: magic vars 15621 1726882596.74108: variable 'ansible_distribution_major_version' from source: facts 15621 1726882596.74129: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882596.74140: variable 'omit' from source: magic vars 15621 1726882596.74187: variable 'omit' from source: magic vars 15621 1726882596.74234: variable 'omit' from source: magic vars 15621 1726882596.74279: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882596.74328: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882596.74355: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882596.74379: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882596.74401: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882596.74440: variable 
'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882596.74449: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882596.74457: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882596.74574: Set connection var ansible_connection to ssh 15621 1726882596.74590: Set connection var ansible_shell_executable to /bin/sh 15621 1726882596.74602: Set connection var ansible_timeout to 10 15621 1726882596.74614: Set connection var ansible_shell_type to sh 15621 1726882596.74627: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882596.74638: Set connection var ansible_pipelining to False 15621 1726882596.74666: variable 'ansible_shell_executable' from source: unknown 15621 1726882596.74720: variable 'ansible_connection' from source: unknown 15621 1726882596.74724: variable 'ansible_module_compression' from source: unknown 15621 1726882596.74728: variable 'ansible_shell_type' from source: unknown 15621 1726882596.74730: variable 'ansible_shell_executable' from source: unknown 15621 1726882596.74732: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882596.74735: variable 'ansible_pipelining' from source: unknown 15621 1726882596.74737: variable 'ansible_timeout' from source: unknown 15621 1726882596.74739: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882596.74869: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15621 1726882596.74887: variable 'omit' from source: magic vars 15621 1726882596.74897: starting attempt loop 15621 1726882596.74904: running the handler 15621 1726882596.75047: variable '__network_connections_result' from source: set_fact 15621 1726882596.75054: variable '__network_connections_result' from source: set_fact 15621 1726882596.75193: handler run complete 15621 1726882596.75232: attempt loop complete, returning result 15621 1726882596.75239: _execute() done 15621 1726882596.75246: dumping result to json 15621 1726882596.75255: done dumping result, returning 15621 1726882596.75271: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affc7ec-ae25-af1a-5b92-00000000002c] 15621 1726882596.75282: sending task result for task 0affc7ec-ae25-af1a-5b92-00000000002c 15621 1726882596.75560: done sending task result for task 0affc7ec-ae25-af1a-5b92-00000000002c 15621 1726882596.75563: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": true, "interface_name": "lsr27", "ip": { "address": "192.0.2.1/24" }, "name": "lsr27", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, 8aca4925-38c7-45a9-b2be-84b83d56f24f\n[004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, 8aca4925-38c7-45a9-b2be-84b83d56f24f (not-active)\n", "stderr_lines": [ "[003] #0, state:up persistent_state:present, 'lsr27': add connection 
lsr27, 8aca4925-38c7-45a9-b2be-84b83d56f24f", "[004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, 8aca4925-38c7-45a9-b2be-84b83d56f24f (not-active)" ] } } 15621 1726882596.75657: no more pending results, returning what we have 15621 1726882596.75661: results queue empty 15621 1726882596.75662: checking for any_errors_fatal 15621 1726882596.75669: done checking for any_errors_fatal 15621 1726882596.75670: checking for max_fail_percentage 15621 1726882596.75672: done checking for max_fail_percentage 15621 1726882596.75674: checking to see if all hosts have failed and the running result is not ok 15621 1726882596.75675: done checking to see if all hosts have failed 15621 1726882596.75676: getting the remaining hosts for this loop 15621 1726882596.75677: done getting the remaining hosts for this loop 15621 1726882596.75681: getting the next task for host managed_node3 15621 1726882596.75687: done getting next task for host managed_node3 15621 1726882596.75692: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 15621 1726882596.75694: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15621 1726882596.75706: getting variables 15621 1726882596.75707: in VariableManager get_vars() 15621 1726882596.75901: Calling all_inventory to load vars for managed_node3 15621 1726882596.75905: Calling groups_inventory to load vars for managed_node3 15621 1726882596.75907: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882596.75917: Calling all_plugins_play to load vars for managed_node3 15621 1726882596.75920: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882596.75926: Calling groups_plugins_play to load vars for managed_node3 15621 1726882596.77548: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882596.79585: done with get_vars() 15621 1726882596.79616: done getting variables 15621 1726882596.79683: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:36:36 -0400 (0:00:00.070) 0:00:28.876 ****** 15621 1726882596.79715: entering _queue_task() for managed_node3/debug 15621 1726882596.80077: worker is 1 (out of 1 available) 15621 1726882596.80091: exiting _queue_task() for managed_node3/debug 15621 1726882596.80103: done queuing things up, now waiting for results queue to drain 15621 1726882596.80105: waiting for pending results... 
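The __network_connections_result fact dumped above is what later verification steps can key off. As a hypothetical illustration only (not a task from the logged run), a play could assert on that registered result like this:

```yaml
# Hypothetical follow-up task, not part of the logged run: checks the
# __network_connections_result fact printed above.
- name: Assert that the lsr27 profile was applied
  ansible.builtin.assert:
    that:
      - __network_connections_result.changed
      - "'lsr27' in __network_connections_result.stderr"
```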
15621 1726882596.80406: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 15621 1726882596.80542: in run() - task 0affc7ec-ae25-af1a-5b92-00000000002d 15621 1726882596.80569: variable 'ansible_search_path' from source: unknown 15621 1726882596.80577: variable 'ansible_search_path' from source: unknown 15621 1726882596.80626: calling self._execute() 15621 1726882596.80928: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882596.80932: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882596.80936: variable 'omit' from source: magic vars 15621 1726882596.81161: variable 'ansible_distribution_major_version' from source: facts 15621 1726882596.81179: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882596.81314: variable 'network_state' from source: role '' defaults 15621 1726882596.81332: Evaluated conditional (network_state != {}): False 15621 1726882596.81339: when evaluation is False, skipping this task 15621 1726882596.81345: _execute() done 15621 1726882596.81351: dumping result to json 15621 1726882596.81358: done dumping result, returning 15621 1726882596.81369: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affc7ec-ae25-af1a-5b92-00000000002d] 15621 1726882596.81384: sending task result for task 0affc7ec-ae25-af1a-5b92-00000000002d skipping: [managed_node3] => { "false_condition": "network_state != {}" } 15621 1726882596.81547: no more pending results, returning what we have 15621 1726882596.81551: results queue empty 15621 1726882596.81552: checking for any_errors_fatal 15621 1726882596.81563: done checking for any_errors_fatal 15621 1726882596.81564: checking for max_fail_percentage 15621 1726882596.81566: done checking for max_fail_percentage 15621 1726882596.81567: checking to see if all hosts have failed and the running result is not ok 15621 1726882596.81568: done checking to see if all hosts have failed 15621 1726882596.81569: getting the remaining hosts for this loop 15621 1726882596.81571: done getting the remaining hosts for this loop 15621 1726882596.81576: getting the next task for host managed_node3 15621 1726882596.81583: done getting next task for host managed_node3 15621 1726882596.81587: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 15621 1726882596.81590: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882596.81607: getting variables 15621 1726882596.81609: in VariableManager get_vars() 15621 1726882596.81652: Calling all_inventory to load vars for managed_node3 15621 1726882596.81655: Calling groups_inventory to load vars for managed_node3 15621 1726882596.81657: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882596.81671: Calling all_plugins_play to load vars for managed_node3 15621 1726882596.81674: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882596.81676: Calling groups_plugins_play to load vars for managed_node3 15621 1726882596.82343: done sending task result for task 0affc7ec-ae25-af1a-5b92-00000000002d 15621 1726882596.82347: WORKER PROCESS EXITING 15621 1726882596.83769: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882596.85784: done with get_vars() 15621 1726882596.85817: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:36:36 -0400 (0:00:00.061) 0:00:28.938 ****** 15621 1726882596.85918: entering _queue_task() for managed_node3/ping 15621 1726882596.85920: Creating lock for ping 15621 1726882596.86293: worker is 1 (out of 1 available) 15621 1726882596.86306: exiting _queue_task() for managed_node3/ping 15621 1726882596.86323: done queuing things up, now waiting for results queue to drain 15621 1726882596.86325: waiting for pending results... 15621 1726882596.86627: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 15621 1726882596.86753: in run() - task 0affc7ec-ae25-af1a-5b92-00000000002e 15621 1726882596.86776: variable 'ansible_search_path' from source: unknown 15621 1726882596.86785: variable 'ansible_search_path' from source: unknown 15621 1726882596.86835: calling self._execute() 15621 1726882596.86943: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882596.86962: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882596.86979: variable 'omit' from source: magic vars 15621 1726882596.87398: variable 'ansible_distribution_major_version' from source: facts 15621 1726882596.87417: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882596.87432: variable 'omit' from source: magic vars 15621 1726882596.87477: variable 'omit' from source: magic vars 15621 1726882596.87527: variable 'omit' from source: magic vars 15621 1726882596.87574: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882596.87623: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882596.87651: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882596.87721: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882596.87724: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882596.87732: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882596.87741: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882596.87749: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882596.87863: Set connection var ansible_connection to ssh 15621 1726882596.87876: Set connection var ansible_shell_executable to /bin/sh 15621 1726882596.87884: Set connection var ansible_timeout to 10 15621 1726882596.87890: Set connection var ansible_shell_type to sh 15621 1726882596.87897: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882596.87905: Set connection var ansible_pipelining to False 15621 1726882596.87938: variable 'ansible_shell_executable' from source: unknown 15621 1726882596.87941: variable 'ansible_connection' from source: unknown 15621 1726882596.88046: variable 'ansible_module_compression' from source: unknown 15621 1726882596.88049: variable 'ansible_shell_type' from source: unknown 15621 1726882596.88051: variable 'ansible_shell_executable' from source: unknown 15621 1726882596.88054: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882596.88056: variable 'ansible_pipelining' from source: unknown 15621 1726882596.88058: variable 'ansible_timeout' from source: unknown 15621 1726882596.88060: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882596.88205: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15621 1726882596.88224: variable 'omit' from source: magic vars 15621 1726882596.88234: starting attempt loop 15621 1726882596.88242: running the handler 15621 1726882596.88258: _low_level_execute_command(): starting 15621 1726882596.88269: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15621 1726882596.89011: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882596.89030: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882596.89052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882596.89166: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882596.89266: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882596.89363: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882596.91136: stdout chunk (state=3): >>>/root <<< 15621 1726882596.91330: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882596.91334: stdout chunk (state=3): >>><<< 15621 1726882596.91464: stderr chunk 
(state=3): >>><<< 15621 1726882596.91468: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882596.91470: _low_level_execute_command(): starting 15621 1726882596.91474: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882596.9137008-16689-188328703614045 `" && echo ansible-tmp-1726882596.9137008-16689-188328703614045="` echo /root/.ansible/tmp/ansible-tmp-1726882596.9137008-16689-188328703614045 `" ) && sleep 0' 15621 1726882596.92071: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882596.92135: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882596.92211: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882596.92234: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882596.92262: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882596.92382: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882596.94388: stdout chunk (state=3): >>>ansible-tmp-1726882596.9137008-16689-188328703614045=/root/.ansible/tmp/ansible-tmp-1726882596.9137008-16689-188328703614045 <<< 15621 1726882596.94728: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882596.94731: stdout chunk (state=3): >>><<< 15621 1726882596.94734: stderr chunk 
(state=3): >>><<< 15621 1726882596.94737: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882596.9137008-16689-188328703614045=/root/.ansible/tmp/ansible-tmp-1726882596.9137008-16689-188328703614045 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882596.94739: variable 'ansible_module_compression' from source: unknown 15621 1726882596.94742: ANSIBALLZ: Using lock for ping 15621 1726882596.94744: ANSIBALLZ: Acquiring lock 15621 1726882596.94746: ANSIBALLZ: Lock acquired: 140146886033088 15621 1726882596.94748: ANSIBALLZ: Creating module 15621 1726882597.16068: ANSIBALLZ: Writing module into payload 15621 1726882597.16154: ANSIBALLZ: Writing module 15621 1726882597.16228: ANSIBALLZ: Renaming module 15621 1726882597.16231: ANSIBALLZ: Done creating module 15621 1726882597.16233: variable 'ansible_facts' from source: unknown 15621 1726882597.16294: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882596.9137008-16689-188328703614045/AnsiballZ_ping.py 15621 1726882597.16497: Sending initial data 15621 1726882597.16501: Sent initial data (153 bytes) 15621 1726882597.17254: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882597.17307: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882597.17339: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882597.17369: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882597.17501: stderr 
chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882597.19248: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15621 1726882597.19323: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15621 1726882597.19409: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmpsd1og2li /root/.ansible/tmp/ansible-tmp-1726882596.9137008-16689-188328703614045/AnsiballZ_ping.py <<< 15621 1726882597.19413: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882596.9137008-16689-188328703614045/AnsiballZ_ping.py" <<< 15621 1726882597.19512: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmpsd1og2li" to remote "/root/.ansible/tmp/ansible-tmp-1726882596.9137008-16689-188328703614045/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882596.9137008-16689-188328703614045/AnsiballZ_ping.py" <<< 15621 1726882597.20492: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882597.20533: stderr chunk (state=3): >>><<< 15621 1726882597.20542: stdout chunk (state=3): >>><<< 15621 1726882597.20574: done transferring module to remote 15621 1726882597.20675: _low_level_execute_command(): starting 15621 1726882597.20679: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882596.9137008-16689-188328703614045/ /root/.ansible/tmp/ansible-tmp-1726882596.9137008-16689-188328703614045/AnsiballZ_ping.py && sleep 0' 15621 1726882597.21295: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882597.21311: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882597.21331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882597.21349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882597.21379: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 15621 1726882597.21393: stderr chunk (state=3): >>>debug2: match not found <<< 15621 1726882597.21491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882597.21508: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882597.21534: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882597.21561: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882597.21668: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882597.23645: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882597.23649: stdout chunk (state=3): >>><<< 15621 1726882597.23654: stderr chunk (state=3): >>><<< 15621 1726882597.23751: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882597.23755: _low_level_execute_command(): starting 15621 1726882597.23758: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882596.9137008-16689-188328703614045/AnsiballZ_ping.py && sleep 0' 15621 1726882597.24556: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882597.24560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address <<< 15621 1726882597.24564: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 15621 1726882597.24567: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882597.24584: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882597.24634: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882597.24730: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882597.41153: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 15621 1726882597.42643: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. <<< 15621 1726882597.42797: stderr chunk (state=3): >>><<< 15621 1726882597.42800: stdout chunk (state=3): >>><<< 15621 1726882597.42804: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 
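The exchange above is the standard remote-execution lifecycle for the connectivity re-test: Ansible creates a temp directory under /root/.ansible/tmp, uploads AnsiballZ_ping.py over the SFTP subsystem, marks it executable, runs it with /usr/bin/python3.12, and the module replies with {"ping": "pong"} before the temp directory is removed. As an illustrative sketch only (the actual task at roles/network/tasks/main.yml:192 in fedora.linux_system_roles.network may be written differently), the task driving this boils down to:

  # Sketch for illustration, not the verbatim role task.
  - name: Re-test connectivity
    ansible.builtin.ping: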
15621 1726882597.42807: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882596.9137008-16689-188328703614045/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15621 1726882597.42809: _low_level_execute_command(): starting 15621 1726882597.42812: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882596.9137008-16689-188328703614045/ > /dev/null 2>&1 && sleep 0' 15621 1726882597.44030: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882597.44066: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882597.44085: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882597.44356: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882597.46339: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882597.46343: stdout chunk (state=3): >>><<< 15621 1726882597.46350: stderr chunk (state=3): >>><<< 15621 1726882597.46374: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882597.46382: handler run complete 15621 1726882597.46400: attempt loop complete, returning result 15621 1726882597.46403: _execute() done 15621 1726882597.46406: dumping result to json 15621 1726882597.46410: done dumping result, returning 15621 1726882597.46420: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affc7ec-ae25-af1a-5b92-00000000002e] 15621 1726882597.46426: sending task result for task 0affc7ec-ae25-af1a-5b92-00000000002e 15621 1726882597.46516: done sending task result for task 0affc7ec-ae25-af1a-5b92-00000000002e 15621 1726882597.46520: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 15621 1726882597.46602: no more pending results, returning what we have 15621 1726882597.46605: results queue empty 15621 1726882597.46606: checking for any_errors_fatal 15621 1726882597.46611: done checking for any_errors_fatal 15621 1726882597.46612: checking for max_fail_percentage 15621 1726882597.46613: done checking for max_fail_percentage 15621 1726882597.46614: checking to see if all hosts have failed and the running result is not ok 15621 1726882597.46615: done checking to see if all hosts have failed 15621 1726882597.46616: getting the remaining hosts for this loop 15621 1726882597.46617: done getting the remaining hosts for this loop 15621 1726882597.46621: getting the next task for host managed_node3 15621 1726882597.46630: done getting next task for host managed_node3 15621 1726882597.46633: ^ task is: TASK: meta (role_complete) 15621 1726882597.46635: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882597.46645: getting variables 15621 1726882597.46646: in VariableManager get_vars() 15621 1726882597.46686: Calling all_inventory to load vars for managed_node3 15621 1726882597.46689: Calling groups_inventory to load vars for managed_node3 15621 1726882597.46691: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882597.46702: Calling all_plugins_play to load vars for managed_node3 15621 1726882597.46705: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882597.46708: Calling groups_plugins_play to load vars for managed_node3 15621 1726882597.49053: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882597.60562: done with get_vars() 15621 1726882597.60606: done getting variables 15621 1726882597.60686: done queuing things up, now waiting for results queue to drain 15621 1726882597.60689: results queue empty 15621 1726882597.60690: checking for any_errors_fatal 15621 1726882597.60723: done checking for any_errors_fatal 15621 1726882597.60725: checking for max_fail_percentage 15621 1726882597.60726: done checking for max_fail_percentage 15621 1726882597.60727: checking to see if all hosts have failed and the running result is not ok 15621 1726882597.60728: done checking to see if all hosts have failed 15621 1726882597.60728: getting the remaining hosts for this loop 15621 1726882597.60730: done getting the remaining hosts for this loop 15621 1726882597.60733: getting the next task for host managed_node3 15621 1726882597.60738: done getting next task for host managed_node3 15621 1726882597.60740: ^ task is: TASK: Include the task 'assert_output_in_stderr_without_warnings.yml' 15621 1726882597.60742: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15621 1726882597.60745: getting variables 15621 1726882597.60746: in VariableManager get_vars() 15621 1726882597.60759: Calling all_inventory to load vars for managed_node3 15621 1726882597.60762: Calling groups_inventory to load vars for managed_node3 15621 1726882597.60764: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882597.60770: Calling all_plugins_play to load vars for managed_node3 15621 1726882597.60772: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882597.60775: Calling groups_plugins_play to load vars for managed_node3 15621 1726882597.62488: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882597.65393: done with get_vars() 15621 1726882597.65428: done getting variables TASK [Include the task 'assert_output_in_stderr_without_warnings.yml'] ********* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:47 Friday 20 September 2024 21:36:37 -0400 (0:00:00.796) 0:00:29.734 ****** 15621 1726882597.65554: entering _queue_task() for managed_node3/include_tasks 15621 1726882597.66323: worker is 1 (out of 1 available) 15621 1726882597.66339: exiting _queue_task() for managed_node3/include_tasks 15621 1726882597.66484: done queuing things up, now waiting for results queue to drain 15621 1726882597.66486: waiting for pending results... 
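At this point the role has finished and the test playbook moves on to an include_tasks step at tests_ethernet.yml:47, which pulls in assert_output_in_stderr_without_warnings.yml from the playbooks/tasks directory. A minimal sketch of such an include, assuming the relative path shown in the log and no extra variables, would be:

  # Sketch only; the real playbook may pass variables or use a different path.
  - name: Include the task 'assert_output_in_stderr_without_warnings.yml'
    ansible.builtin.include_tasks: tasks/assert_output_in_stderr_without_warnings.yml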
15621 1726882597.66947: running TaskExecutor() for managed_node3/TASK: Include the task 'assert_output_in_stderr_without_warnings.yml' 15621 1726882597.66953: in run() - task 0affc7ec-ae25-af1a-5b92-000000000030 15621 1726882597.67149: variable 'ansible_search_path' from source: unknown 15621 1726882597.67191: calling self._execute() 15621 1726882597.67294: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882597.67299: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882597.67302: variable 'omit' from source: magic vars 15621 1726882597.68113: variable 'ansible_distribution_major_version' from source: facts 15621 1726882597.68330: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882597.68338: _execute() done 15621 1726882597.68341: dumping result to json 15621 1726882597.68347: done dumping result, returning 15621 1726882597.68357: done running TaskExecutor() for managed_node3/TASK: Include the task 'assert_output_in_stderr_without_warnings.yml' [0affc7ec-ae25-af1a-5b92-000000000030] 15621 1726882597.68362: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000030 15621 1726882597.68481: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000030 15621 1726882597.68484: WORKER PROCESS EXITING 15621 1726882597.68518: no more pending results, returning what we have 15621 1726882597.68527: in VariableManager get_vars() 15621 1726882597.68576: Calling all_inventory to load vars for managed_node3 15621 1726882597.68579: Calling groups_inventory to load vars for managed_node3 15621 1726882597.68581: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882597.68599: Calling all_plugins_play to load vars for managed_node3 15621 1726882597.68601: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882597.68605: Calling groups_plugins_play to load vars for managed_node3 15621 1726882597.71633: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882597.75189: done with get_vars() 15621 1726882597.75332: variable 'ansible_search_path' from source: unknown 15621 1726882597.75349: we have included files to process 15621 1726882597.75350: generating all_blocks data 15621 1726882597.75351: done generating all_blocks data 15621 1726882597.75357: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml 15621 1726882597.75358: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml 15621 1726882597.75360: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml 15621 1726882597.75986: done processing included file 15621 1726882597.75989: iterating over new_blocks loaded from include file 15621 1726882597.75991: in VariableManager get_vars() 15621 1726882597.76009: done with get_vars() 15621 1726882597.76011: filtering new block on tags 15621 1726882597.76034: done filtering new block on tags 15621 1726882597.76037: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml for managed_node3 15621 1726882597.76042: extending 
task lists for all hosts with included blocks 15621 1726882597.76082: done extending task lists 15621 1726882597.76083: done processing included files 15621 1726882597.76084: results queue empty 15621 1726882597.76085: checking for any_errors_fatal 15621 1726882597.76086: done checking for any_errors_fatal 15621 1726882597.76087: checking for max_fail_percentage 15621 1726882597.76088: done checking for max_fail_percentage 15621 1726882597.76089: checking to see if all hosts have failed and the running result is not ok 15621 1726882597.76090: done checking to see if all hosts have failed 15621 1726882597.76091: getting the remaining hosts for this loop 15621 1726882597.76092: done getting the remaining hosts for this loop 15621 1726882597.76095: getting the next task for host managed_node3 15621 1726882597.76100: done getting next task for host managed_node3 15621 1726882597.76102: ^ task is: TASK: Assert that warnings is empty 15621 1726882597.76104: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15621 1726882597.76106: getting variables 15621 1726882597.76107: in VariableManager get_vars() 15621 1726882597.76120: Calling all_inventory to load vars for managed_node3 15621 1726882597.76125: Calling groups_inventory to load vars for managed_node3 15621 1726882597.76127: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882597.76134: Calling all_plugins_play to load vars for managed_node3 15621 1726882597.76137: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882597.76140: Calling groups_plugins_play to load vars for managed_node3 15621 1726882597.78091: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882597.81555: done with get_vars() 15621 1726882597.81584: done getting variables 15621 1726882597.81633: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that warnings is empty] ******************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml:3 Friday 20 September 2024 21:36:37 -0400 (0:00:00.161) 0:00:29.895 ****** 15621 1726882597.81664: entering _queue_task() for managed_node3/assert 15621 1726882597.82143: worker is 1 (out of 1 available) 15621 1726882597.82155: exiting _queue_task() for managed_node3/assert 15621 1726882597.82167: done queuing things up, now waiting for results queue to drain 15621 1726882597.82169: waiting for pending results... 
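The first task from the included file, "Assert that warnings is empty" (assert_output_in_stderr_without_warnings.yml:3), is queued next. Judging by the conditional the executor evaluates a little further down ('warnings' not in __network_connections_result), the assertion is roughly equivalent to the following hypothetical reconstruction; the real file may phrase it differently:

  # Hypothetical reconstruction based on the logged conditional; not the verbatim task.
  - name: Assert that warnings is empty
    ansible.builtin.assert:
      that:
        - "'warnings' not in __network_connections_result"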
15621 1726882597.82471: running TaskExecutor() for managed_node3/TASK: Assert that warnings is empty 15621 1726882597.82541: in run() - task 0affc7ec-ae25-af1a-5b92-000000000304 15621 1726882597.82546: variable 'ansible_search_path' from source: unknown 15621 1726882597.82549: variable 'ansible_search_path' from source: unknown 15621 1726882597.82587: calling self._execute() 15621 1726882597.82696: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882597.82758: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882597.82764: variable 'omit' from source: magic vars 15621 1726882597.83176: variable 'ansible_distribution_major_version' from source: facts 15621 1726882597.83204: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882597.83223: variable 'omit' from source: magic vars 15621 1726882597.83273: variable 'omit' from source: magic vars 15621 1726882597.83406: variable 'omit' from source: magic vars 15621 1726882597.83409: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882597.83419: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882597.83449: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882597.83473: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882597.83491: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882597.83533: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882597.83547: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882597.83556: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882597.83678: Set connection var ansible_connection to ssh 15621 1726882597.83695: Set connection var ansible_shell_executable to /bin/sh 15621 1726882597.83709: Set connection var ansible_timeout to 10 15621 1726882597.83718: Set connection var ansible_shell_type to sh 15621 1726882597.83731: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882597.83769: Set connection var ansible_pipelining to False 15621 1726882597.83783: variable 'ansible_shell_executable' from source: unknown 15621 1726882597.83792: variable 'ansible_connection' from source: unknown 15621 1726882597.83800: variable 'ansible_module_compression' from source: unknown 15621 1726882597.83808: variable 'ansible_shell_type' from source: unknown 15621 1726882597.83877: variable 'ansible_shell_executable' from source: unknown 15621 1726882597.83880: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882597.83883: variable 'ansible_pipelining' from source: unknown 15621 1726882597.83886: variable 'ansible_timeout' from source: unknown 15621 1726882597.83888: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882597.84011: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15621 1726882597.84034: variable 'omit' from source: magic vars 15621 
1726882597.84055: starting attempt loop 15621 1726882597.84064: running the handler 15621 1726882597.84229: variable '__network_connections_result' from source: set_fact 15621 1726882597.84248: Evaluated conditional ('warnings' not in __network_connections_result): True 15621 1726882597.84260: handler run complete 15621 1726882597.84311: attempt loop complete, returning result 15621 1726882597.84314: _execute() done 15621 1726882597.84316: dumping result to json 15621 1726882597.84318: done dumping result, returning 15621 1726882597.84320: done running TaskExecutor() for managed_node3/TASK: Assert that warnings is empty [0affc7ec-ae25-af1a-5b92-000000000304] 15621 1726882597.84324: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000304 ok: [managed_node3] => { "changed": false } MSG: All assertions passed 15621 1726882597.84464: no more pending results, returning what we have 15621 1726882597.84467: results queue empty 15621 1726882597.84468: checking for any_errors_fatal 15621 1726882597.84470: done checking for any_errors_fatal 15621 1726882597.84471: checking for max_fail_percentage 15621 1726882597.84472: done checking for max_fail_percentage 15621 1726882597.84473: checking to see if all hosts have failed and the running result is not ok 15621 1726882597.84475: done checking to see if all hosts have failed 15621 1726882597.84475: getting the remaining hosts for this loop 15621 1726882597.84477: done getting the remaining hosts for this loop 15621 1726882597.84481: getting the next task for host managed_node3 15621 1726882597.84488: done getting next task for host managed_node3 15621 1726882597.84491: ^ task is: TASK: Assert that there is output in stderr 15621 1726882597.84493: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882597.84497: getting variables 15621 1726882597.84499: in VariableManager get_vars() 15621 1726882597.84539: Calling all_inventory to load vars for managed_node3 15621 1726882597.84542: Calling groups_inventory to load vars for managed_node3 15621 1726882597.84545: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882597.84559: Calling all_plugins_play to load vars for managed_node3 15621 1726882597.84562: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882597.84565: Calling groups_plugins_play to load vars for managed_node3 15621 1726882597.85844: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000304 15621 1726882597.85847: WORKER PROCESS EXITING 15621 1726882597.86181: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882597.87335: done with get_vars() 15621 1726882597.87352: done getting variables 15621 1726882597.87396: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that there is output in stderr] *********************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml:8 Friday 20 September 2024 21:36:37 -0400 (0:00:00.057) 0:00:29.953 ****** 15621 1726882597.87426: entering _queue_task() for managed_node3/assert 15621 1726882597.87713: worker is 1 (out of 1 available) 15621 1726882597.87726: exiting _queue_task() for managed_node3/assert 15621 1726882597.87739: done queuing things up, now waiting for results queue to drain 15621 1726882597.87741: waiting for pending results... 
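The companion assertion at assert_output_in_stderr_without_warnings.yml:8 checks the other half of the contract: that __network_connections_result did capture stderr output. Based on the conditional logged below ('stderr' in __network_connections_result), a hypothetical reconstruction is:

  # Hypothetical reconstruction based on the logged conditional; not the verbatim task.
  - name: Assert that there is output in stderr
    ansible.builtin.assert:
      that:
        - "'stderr' in __network_connections_result"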
15621 1726882597.88041: running TaskExecutor() for managed_node3/TASK: Assert that there is output in stderr 15621 1726882597.88097: in run() - task 0affc7ec-ae25-af1a-5b92-000000000305 15621 1726882597.88118: variable 'ansible_search_path' from source: unknown 15621 1726882597.88129: variable 'ansible_search_path' from source: unknown 15621 1726882597.88170: calling self._execute() 15621 1726882597.88266: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882597.88280: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882597.88295: variable 'omit' from source: magic vars 15621 1726882597.88759: variable 'ansible_distribution_major_version' from source: facts 15621 1726882597.88768: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882597.88776: variable 'omit' from source: magic vars 15621 1726882597.88804: variable 'omit' from source: magic vars 15621 1726882597.88832: variable 'omit' from source: magic vars 15621 1726882597.88868: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882597.88899: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882597.88914: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882597.88930: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882597.88940: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882597.88967: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882597.88974: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882597.88977: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882597.89054: Set connection var ansible_connection to ssh 15621 1726882597.89061: Set connection var ansible_shell_executable to /bin/sh 15621 1726882597.89075: Set connection var ansible_timeout to 10 15621 1726882597.89081: Set connection var ansible_shell_type to sh 15621 1726882597.89084: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882597.89088: Set connection var ansible_pipelining to False 15621 1726882597.89102: variable 'ansible_shell_executable' from source: unknown 15621 1726882597.89105: variable 'ansible_connection' from source: unknown 15621 1726882597.89108: variable 'ansible_module_compression' from source: unknown 15621 1726882597.89110: variable 'ansible_shell_type' from source: unknown 15621 1726882597.89112: variable 'ansible_shell_executable' from source: unknown 15621 1726882597.89115: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882597.89120: variable 'ansible_pipelining' from source: unknown 15621 1726882597.89124: variable 'ansible_timeout' from source: unknown 15621 1726882597.89128: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882597.89237: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15621 1726882597.89246: variable 'omit' from source: magic vars 
15621 1726882597.89251: starting attempt loop 15621 1726882597.89254: running the handler 15621 1726882597.89347: variable '__network_connections_result' from source: set_fact 15621 1726882597.89357: Evaluated conditional ('stderr' in __network_connections_result): True 15621 1726882597.89363: handler run complete 15621 1726882597.89376: attempt loop complete, returning result 15621 1726882597.89379: _execute() done 15621 1726882597.89382: dumping result to json 15621 1726882597.89384: done dumping result, returning 15621 1726882597.89391: done running TaskExecutor() for managed_node3/TASK: Assert that there is output in stderr [0affc7ec-ae25-af1a-5b92-000000000305] 15621 1726882597.89399: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000305 15621 1726882597.89490: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000305 15621 1726882597.89493: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 15621 1726882597.89559: no more pending results, returning what we have 15621 1726882597.89562: results queue empty 15621 1726882597.89563: checking for any_errors_fatal 15621 1726882597.89569: done checking for any_errors_fatal 15621 1726882597.89569: checking for max_fail_percentage 15621 1726882597.89574: done checking for max_fail_percentage 15621 1726882597.89574: checking to see if all hosts have failed and the running result is not ok 15621 1726882597.89575: done checking to see if all hosts have failed 15621 1726882597.89576: getting the remaining hosts for this loop 15621 1726882597.89577: done getting the remaining hosts for this loop 15621 1726882597.89581: getting the next task for host managed_node3 15621 1726882597.89588: done getting next task for host managed_node3 15621 1726882597.89590: ^ task is: TASK: meta (flush_handlers) 15621 1726882597.89592: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882597.89596: getting variables 15621 1726882597.89597: in VariableManager get_vars() 15621 1726882597.89631: Calling all_inventory to load vars for managed_node3 15621 1726882597.89634: Calling groups_inventory to load vars for managed_node3 15621 1726882597.89636: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882597.89645: Calling all_plugins_play to load vars for managed_node3 15621 1726882597.89648: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882597.89651: Calling groups_plugins_play to load vars for managed_node3 15621 1726882597.90698: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882597.91975: done with get_vars() 15621 1726882597.91998: done getting variables 15621 1726882597.92068: in VariableManager get_vars() 15621 1726882597.92083: Calling all_inventory to load vars for managed_node3 15621 1726882597.92086: Calling groups_inventory to load vars for managed_node3 15621 1726882597.92088: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882597.92093: Calling all_plugins_play to load vars for managed_node3 15621 1726882597.92096: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882597.92099: Calling groups_plugins_play to load vars for managed_node3 15621 1726882597.93612: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882597.95580: done with get_vars() 15621 1726882597.95601: done queuing things up, now waiting for results queue to drain 15621 1726882597.95603: results queue empty 15621 1726882597.95603: checking for any_errors_fatal 15621 1726882597.95605: done checking for any_errors_fatal 15621 1726882597.95606: checking for max_fail_percentage 15621 1726882597.95606: done checking for max_fail_percentage 15621 1726882597.95607: checking to see if all hosts have failed and the running result is not ok 15621 1726882597.95607: done checking to see if all hosts have failed 15621 1726882597.95608: getting the remaining hosts for this loop 15621 1726882597.95613: done getting the remaining hosts for this loop 15621 1726882597.95616: getting the next task for host managed_node3 15621 1726882597.95619: done getting next task for host managed_node3 15621 1726882597.95621: ^ task is: TASK: meta (flush_handlers) 15621 1726882597.95623: ^ state is: HOST STATE: block=6, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882597.95627: getting variables 15621 1726882597.95628: in VariableManager get_vars() 15621 1726882597.95637: Calling all_inventory to load vars for managed_node3 15621 1726882597.95639: Calling groups_inventory to load vars for managed_node3 15621 1726882597.95640: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882597.95644: Calling all_plugins_play to load vars for managed_node3 15621 1726882597.95645: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882597.95647: Calling groups_plugins_play to load vars for managed_node3 15621 1726882597.96460: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882597.98203: done with get_vars() 15621 1726882597.98228: done getting variables 15621 1726882597.98271: in VariableManager get_vars() 15621 1726882597.98280: Calling all_inventory to load vars for managed_node3 15621 1726882597.98281: Calling groups_inventory to load vars for managed_node3 15621 1726882597.98283: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882597.98286: Calling all_plugins_play to load vars for managed_node3 15621 1726882597.98288: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882597.98290: Calling groups_plugins_play to load vars for managed_node3 15621 1726882597.99620: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882598.00839: done with get_vars() 15621 1726882598.00864: done queuing things up, now waiting for results queue to drain 15621 1726882598.00866: results queue empty 15621 1726882598.00866: checking for any_errors_fatal 15621 1726882598.00867: done checking for any_errors_fatal 15621 1726882598.00868: checking for max_fail_percentage 15621 1726882598.00868: done checking for max_fail_percentage 15621 1726882598.00869: checking to see if all hosts have failed and the running result is not ok 15621 1726882598.00869: done checking to see if all hosts have failed 15621 1726882598.00870: getting the remaining hosts for this loop 15621 1726882598.00871: done getting the remaining hosts for this loop 15621 1726882598.00873: getting the next task for host managed_node3 15621 1726882598.00876: done getting next task for host managed_node3 15621 1726882598.00876: ^ task is: None 15621 1726882598.00877: ^ state is: HOST STATE: block=7, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882598.00878: done queuing things up, now waiting for results queue to drain 15621 1726882598.00879: results queue empty 15621 1726882598.00879: checking for any_errors_fatal 15621 1726882598.00879: done checking for any_errors_fatal 15621 1726882598.00880: checking for max_fail_percentage 15621 1726882598.00880: done checking for max_fail_percentage 15621 1726882598.00881: checking to see if all hosts have failed and the running result is not ok 15621 1726882598.00881: done checking to see if all hosts have failed 15621 1726882598.00882: getting the next task for host managed_node3 15621 1726882598.00885: done getting next task for host managed_node3 15621 1726882598.00886: ^ task is: None 15621 1726882598.00886: ^ state is: HOST STATE: block=7, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15621 1726882598.00924: in VariableManager get_vars() 15621 1726882598.00937: done with get_vars() 15621 1726882598.00941: in VariableManager get_vars() 15621 1726882598.00946: done with get_vars() 15621 1726882598.00949: variable 'omit' from source: magic vars 15621 1726882598.00974: in VariableManager get_vars() 15621 1726882598.00981: done with get_vars() 15621 1726882598.00996: variable 'omit' from source: magic vars PLAY [Play for cleaning up the test device and the connection profile] ********* 15621 1726882598.01123: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15621 1726882598.01143: getting the remaining hosts for this loop 15621 1726882598.01144: done getting the remaining hosts for this loop 15621 1726882598.01146: getting the next task for host managed_node3 15621 1726882598.01147: done getting next task for host managed_node3 15621 1726882598.01149: ^ task is: TASK: Gathering Facts 15621 1726882598.01150: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882598.01151: getting variables 15621 1726882598.01152: in VariableManager get_vars() 15621 1726882598.01158: Calling all_inventory to load vars for managed_node3 15621 1726882598.01159: Calling groups_inventory to load vars for managed_node3 15621 1726882598.01161: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882598.01165: Calling all_plugins_play to load vars for managed_node3 15621 1726882598.01167: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882598.01169: Calling groups_plugins_play to load vars for managed_node3 15621 1726882598.01989: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882598.03660: done with get_vars() 15621 1726882598.03678: done getting variables 15621 1726882598.03711: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:50 Friday 20 September 2024 21:36:38 -0400 (0:00:00.163) 0:00:30.116 ****** 15621 1726882598.03732: entering _queue_task() for managed_node3/gather_facts 15621 1726882598.03990: worker is 1 (out of 1 available) 15621 1726882598.04002: exiting _queue_task() for managed_node3/gather_facts 15621 1726882598.04013: done queuing things up, now waiting for results queue to drain 15621 1726882598.04015: waiting for pending results... 
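(Aside, not captured output: the lines that follow show the setup module being packaged as AnsiballZ_setup.py, copied to a remote temporary directory, and executed with /usr/bin/python3.12; its stdout is a single JSON document whose ansible_facts mapping is what conditionals such as ansible_distribution_major_version != '6' are later evaluated against. The short Python sketch below is a minimal illustration of that data shape only. The key names and sample values are copied from the fact payload recorded later in this log; the variable sample_module_stdout is hypothetical and stands in for the module's real output.)

    import json

    # Hypothetical stand-in for the stdout of the transferred AnsiballZ_setup.py
    # module; keys and sample values are copied from the fact payload recorded
    # later in this log.
    sample_module_stdout = json.dumps({
        "ansible_facts": {
            "ansible_distribution": "Fedora",
            "ansible_distribution_major_version": "40",
            "ansible_default_ipv4": {"interface": "eth0", "address": "10.31.45.226"},
        }
    })

    facts = json.loads(sample_module_stdout)["ansible_facts"]

    # The same comparison the log reports as
    # "Evaluated conditional (ansible_distribution_major_version != '6'): True".
    print(facts["ansible_distribution_major_version"] != "6")  # True
    print(facts["ansible_default_ipv4"]["address"])            # 10.31.45.226

(In the real run the JSON is produced on the managed node and parsed by the gather_facts action plugin; the sketch only mirrors the structure for readability.)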
15621 1726882598.04216: running TaskExecutor() for managed_node3/TASK: Gathering Facts 15621 1726882598.04294: in run() - task 0affc7ec-ae25-af1a-5b92-000000000316 15621 1726882598.04307: variable 'ansible_search_path' from source: unknown 15621 1726882598.04344: calling self._execute() 15621 1726882598.04413: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882598.04417: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882598.04427: variable 'omit' from source: magic vars 15621 1726882598.04764: variable 'ansible_distribution_major_version' from source: facts 15621 1726882598.04768: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882598.04774: variable 'omit' from source: magic vars 15621 1726882598.04783: variable 'omit' from source: magic vars 15621 1726882598.04813: variable 'omit' from source: magic vars 15621 1726882598.04847: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882598.04900: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882598.04905: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882598.04908: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882598.04920: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882598.04947: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882598.04950: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882598.04953: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882598.05031: Set connection var ansible_connection to ssh 15621 1726882598.05038: Set connection var ansible_shell_executable to /bin/sh 15621 1726882598.05044: Set connection var ansible_timeout to 10 15621 1726882598.05047: Set connection var ansible_shell_type to sh 15621 1726882598.05052: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882598.05058: Set connection var ansible_pipelining to False 15621 1726882598.05077: variable 'ansible_shell_executable' from source: unknown 15621 1726882598.05080: variable 'ansible_connection' from source: unknown 15621 1726882598.05083: variable 'ansible_module_compression' from source: unknown 15621 1726882598.05086: variable 'ansible_shell_type' from source: unknown 15621 1726882598.05088: variable 'ansible_shell_executable' from source: unknown 15621 1726882598.05090: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882598.05095: variable 'ansible_pipelining' from source: unknown 15621 1726882598.05098: variable 'ansible_timeout' from source: unknown 15621 1726882598.05102: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882598.05250: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15621 1726882598.05260: variable 'omit' from source: magic vars 15621 1726882598.05264: starting attempt loop 15621 1726882598.05267: running the 
handler 15621 1726882598.05282: variable 'ansible_facts' from source: unknown 15621 1726882598.05298: _low_level_execute_command(): starting 15621 1726882598.05305: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15621 1726882598.05828: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882598.05851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882598.05856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882598.05909: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882598.05912: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882598.05928: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882598.06020: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882598.07792: stdout chunk (state=3): >>>/root <<< 15621 1726882598.07913: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882598.07999: stderr chunk (state=3): >>><<< 15621 1726882598.08002: stdout chunk (state=3): >>><<< 15621 1726882598.08074: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882598.08078: _low_level_execute_command(): starting 15621 1726882598.08082: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726882598.080208-16753-243675909377633 `" && echo ansible-tmp-1726882598.080208-16753-243675909377633="` echo /root/.ansible/tmp/ansible-tmp-1726882598.080208-16753-243675909377633 `" ) && sleep 0' 15621 1726882598.08656: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882598.08659: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found <<< 15621 1726882598.08662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882598.08670: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882598.08673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882598.08730: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882598.08755: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882598.08892: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882598.10868: stdout chunk (state=3): >>>ansible-tmp-1726882598.080208-16753-243675909377633=/root/.ansible/tmp/ansible-tmp-1726882598.080208-16753-243675909377633 <<< 15621 1726882598.10993: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882598.11077: stderr chunk (state=3): >>><<< 15621 1726882598.11085: stdout chunk (state=3): >>><<< 15621 1726882598.11112: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882598.080208-16753-243675909377633=/root/.ansible/tmp/ansible-tmp-1726882598.080208-16753-243675909377633 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 15621 1726882598.11255: variable 'ansible_module_compression' from source: unknown 15621 1726882598.11259: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15621x2kdpbzd/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15621 1726882598.11373: variable 'ansible_facts' from source: unknown 15621 1726882598.11463: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882598.080208-16753-243675909377633/AnsiballZ_setup.py 15621 1726882598.11656: Sending initial data 15621 1726882598.11660: Sent initial data (153 bytes) 15621 1726882598.12519: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882598.12541: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882598.12559: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882598.12685: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882598.14274: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 15621 1726882598.14278: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15621 1726882598.14351: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15621 1726882598.14463: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmp2qi9bwts /root/.ansible/tmp/ansible-tmp-1726882598.080208-16753-243675909377633/AnsiballZ_setup.py <<< 15621 1726882598.14467: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882598.080208-16753-243675909377633/AnsiballZ_setup.py" <<< 15621 1726882598.14557: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmp2qi9bwts" to remote "/root/.ansible/tmp/ansible-tmp-1726882598.080208-16753-243675909377633/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882598.080208-16753-243675909377633/AnsiballZ_setup.py" <<< 15621 1726882598.16074: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882598.16249: stderr chunk (state=3): >>><<< 15621 1726882598.16252: stdout chunk (state=3): >>><<< 15621 1726882598.16254: done transferring module to remote 15621 1726882598.16256: _low_level_execute_command(): starting 15621 1726882598.16259: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882598.080208-16753-243675909377633/ /root/.ansible/tmp/ansible-tmp-1726882598.080208-16753-243675909377633/AnsiballZ_setup.py && sleep 0' 15621 1726882598.16831: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882598.16835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found <<< 15621 1726882598.16838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15621 1726882598.16840: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882598.16842: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882598.16903: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882598.16912: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882598.16916: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882598.17002: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882598.18795: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882598.18844: stderr chunk (state=3): >>><<< 15621 1726882598.18847: stdout chunk (state=3): >>><<< 15621 1726882598.18863: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882598.18866: _low_level_execute_command(): starting 15621 1726882598.18870: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882598.080208-16753-243675909377633/AnsiballZ_setup.py && sleep 0' 15621 1726882598.19326: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882598.19329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882598.19332: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration <<< 15621 1726882598.19334: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882598.19336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882598.19387: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882598.19394: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882598.19480: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882600.26741: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", "second": "38", "epoch": "1726882598", "epoch_int": "1726882598", "date": "2024-09-20", "time": "21:36:38", "iso8601_micro": "2024-09-21T01:36:38.483107Z", "iso8601": "2024-09-21T01:36:38Z", "iso8601_basic": "20240920T213638483107", "iso8601_basic_short": "20240920T213638", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_user_id": "root", 
"ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.180 36814 10.31.45.226 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 36814 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:b5954bb9-e972-4b2a-94f1-a82c77e96f77", "ansible_local": {}, "ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "ip-10-31-45-226.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-226", "ansible_nodename": "ip-10-31-45-226.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ea14692d25e88f0b7167787b368d", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_loadavg": {"1m": 0.748046875, "5m": 0.650390625, "15m": 0.328125}, "ansible_apparmor": {"status": "disabled"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCqxUjCEqNblrTyW6Uf6mIYxca8N+p8oJNuOoXU65bGRNg3CMa5WjaOdqcLJoa5cqHU94Eb2GKTTyez0hcVUk7tsi3NxQudVrBDJQwbGPLKwHTfAOeffQrSKU6cQIc1wl+jLeNyQet7t+mRPHDLLjdsLuWud7KDSFY7tB05hqCIT7br7Ql/dFhcnCdWQFQMOHFOz3ScJe9gey/LD3ji7GRONjSr/t5cpKmB6mxzEmsb1n6YZdbP8HCphGcvKR4W+uaX3gVfQE0qvrqlobTyex8yIrkML2bRGO0cQ0YQWRYUwl+2NZufO8pixR1WlzvjooEQLCa77cJ6SZ8LyFkDOI+mMyuj8kcM9hS4AD91rPxl8C0d6Jg8RKqnImxC3X/NNaRYHqlewUGo6VKkcO4+lxgJGqYFcmkGEHzq4fuf6gtrr3rJkcIFcrluI0mSyZ2wXzI9K1OLHK0fnDvDUdV21RdTxfpz2ZFqykIWxdtugE4qaNMgbtV0VnufdkfZoCt9ayU=", 
"ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCVkskQ7Wf194qJaR5aLJzIbxDeKLsVL0wQFKV8r0F7GGZAGvI7/LHajoQ1NvRR35h4P+UpQQWPriVBtLfXYfXQ=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHICQbhAvKstSrwCX3R+nlPjOjLF0EHt/gL32n1ZS9Xl", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3117, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 599, "free": 3117}, "nocache": {"free": 3500, "used": 216}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ea14-692d-25e8-8f0b-7167787b368d", "ansible_product_uuid": "ec22ea14-692d-25e8-8f0b-7167787b368d", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 744, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251384279040, "block_size": 4096, "block_total": 64483404, "block_available": 61373115, "block_used": 3110289, "inode_total": 16384000, "inode_available": 16303143, "inode_used": 80857, "uuid": 
"6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"}], "ansible_lsb": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_is_chroot": false, "ansible_fips": false, "ansible_fibre_channel_wwn": [], "ansible_interfaces": ["eth0", "lsr27", "peerlsr27", "lo"], "ansible_lsr27": {"device": "lsr27", "macaddress": "b2:cc:5d:76:bc:9a", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv4": {"address": "192.0.2.1", "broadcast": "192.0.2.255", "netmask": "255.255.255.0", "network": "192.0.2.0", "prefix": "24"}, "ipv6": [{"address": "fe80::b0cc:5dff:fe76:bc9a", "prefix": "64", "scope": "link"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "02:19:da:ea:a3:f3", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.226", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::19:daff:feea:a3f3", "prefix": "64", "scope": "link"}]}, "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "8a:9f:5f:c3:90:d4", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::889f:5fff:fec3:90d4", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.226", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:19:da:ea:a3:f3", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["192.0.2.1", "10.31.45.226"], "ansible_all_ipv6_addresses": ["fe80::b0cc:5dff:fe76:bc9a", "fe80::19:daff:feea:a3f3", "fe80::889f:5fff:fec3:90d4"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.226", "127.0.0.0/8", "127.0.0.1", "192.0.2.1"], "ipv6": ["::1", "fe80::19:daff:feea:a3f3", "fe80::889f:5fff:fec3:90d4", "fe80::b0cc:5dff:fe76:bc9a"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15621 1726882600.28818: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 
<<< 15621 1726882600.28825: stdout chunk (state=3): >>><<< 15621 1726882600.28828: stderr chunk (state=3): >>><<< 15621 1726882600.28910: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", "second": "38", "epoch": "1726882598", "epoch_int": "1726882598", "date": "2024-09-20", "time": "21:36:38", "iso8601_micro": "2024-09-21T01:36:38.483107Z", "iso8601": "2024-09-21T01:36:38Z", "iso8601_basic": "20240920T213638483107", "iso8601_basic_short": "20240920T213638", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.180 36814 10.31.45.226 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 36814 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:b5954bb9-e972-4b2a-94f1-a82c77e96f77", "ansible_local": {}, "ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "ip-10-31-45-226.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-226", "ansible_nodename": "ip-10-31-45-226.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ea14692d25e88f0b7167787b368d", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": 
"(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_loadavg": {"1m": 0.748046875, "5m": 0.650390625, "15m": 0.328125}, "ansible_apparmor": {"status": "disabled"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCqxUjCEqNblrTyW6Uf6mIYxca8N+p8oJNuOoXU65bGRNg3CMa5WjaOdqcLJoa5cqHU94Eb2GKTTyez0hcVUk7tsi3NxQudVrBDJQwbGPLKwHTfAOeffQrSKU6cQIc1wl+jLeNyQet7t+mRPHDLLjdsLuWud7KDSFY7tB05hqCIT7br7Ql/dFhcnCdWQFQMOHFOz3ScJe9gey/LD3ji7GRONjSr/t5cpKmB6mxzEmsb1n6YZdbP8HCphGcvKR4W+uaX3gVfQE0qvrqlobTyex8yIrkML2bRGO0cQ0YQWRYUwl+2NZufO8pixR1WlzvjooEQLCa77cJ6SZ8LyFkDOI+mMyuj8kcM9hS4AD91rPxl8C0d6Jg8RKqnImxC3X/NNaRYHqlewUGo6VKkcO4+lxgJGqYFcmkGEHzq4fuf6gtrr3rJkcIFcrluI0mSyZ2wXzI9K1OLHK0fnDvDUdV21RdTxfpz2ZFqykIWxdtugE4qaNMgbtV0VnufdkfZoCt9ayU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCVkskQ7Wf194qJaR5aLJzIbxDeKLsVL0wQFKV8r0F7GGZAGvI7/LHajoQ1NvRR35h4P+UpQQWPriVBtLfXYfXQ=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHICQbhAvKstSrwCX3R+nlPjOjLF0EHt/gL32n1ZS9Xl", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3117, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 599, "free": 3117}, "nocache": {"free": 3500, "used": 216}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ea14-692d-25e8-8f0b-7167787b368d", "ansible_product_uuid": "ec22ea14-692d-25e8-8f0b-7167787b368d", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], 
"labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 744, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251384279040, "block_size": 4096, "block_total": 64483404, "block_available": 61373115, "block_used": 3110289, "inode_total": 16384000, "inode_available": 16303143, "inode_used": 80857, "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"}], "ansible_lsb": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_is_chroot": false, "ansible_fips": false, "ansible_fibre_channel_wwn": [], "ansible_interfaces": ["eth0", "lsr27", "peerlsr27", "lo"], "ansible_lsr27": {"device": "lsr27", "macaddress": "b2:cc:5d:76:bc:9a", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv4": {"address": "192.0.2.1", "broadcast": "192.0.2.255", "netmask": "255.255.255.0", "network": "192.0.2.0", "prefix": "24"}, "ipv6": [{"address": "fe80::b0cc:5dff:fe76:bc9a", "prefix": "64", "scope": "link"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "02:19:da:ea:a3:f3", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.226", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::19:daff:feea:a3f3", "prefix": "64", "scope": "link"}]}, "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "8a:9f:5f:c3:90:d4", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::889f:5fff:fec3:90d4", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.226", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:19:da:ea:a3:f3", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["192.0.2.1", "10.31.45.226"], "ansible_all_ipv6_addresses": ["fe80::b0cc:5dff:fe76:bc9a", "fe80::19:daff:feea:a3f3", "fe80::889f:5fff:fec3:90d4"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.226", "127.0.0.0/8", "127.0.0.1", "192.0.2.1"], "ipv6": ["::1", "fe80::19:daff:feea:a3f3", "fe80::889f:5fff:fec3:90d4", "fe80::b0cc:5dff:fe76:bc9a"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], 
"module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 15621 1726882600.29380: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882598.080208-16753-243675909377633/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15621 1726882600.29384: _low_level_execute_command(): starting 15621 1726882600.29387: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882598.080208-16753-243675909377633/ > /dev/null 2>&1 && sleep 0' 15621 1726882600.30062: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882600.30149: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882600.30290: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 
1726882600.30337: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882600.30452: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882600.30619: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882600.32612: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882600.32615: stdout chunk (state=3): >>><<< 15621 1726882600.32617: stderr chunk (state=3): >>><<< 15621 1726882600.32829: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882600.32832: handler run complete 15621 1726882600.32835: variable 'ansible_facts' from source: unknown 15621 1726882600.33120: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882600.33589: variable 'ansible_facts' from source: unknown 15621 1726882600.33692: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882600.33881: attempt loop complete, returning result 15621 1726882600.33890: _execute() done 15621 1726882600.33898: dumping result to json 15621 1726882600.33953: done dumping result, returning 15621 1726882600.33989: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [0affc7ec-ae25-af1a-5b92-000000000316] 15621 1726882600.34052: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000316 15621 1726882600.34702: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000316 15621 1726882600.34706: WORKER PROCESS EXITING ok: [managed_node3] 15621 1726882600.35388: no more pending results, returning what we have 15621 1726882600.35392: results queue empty 15621 1726882600.35393: checking for any_errors_fatal 15621 1726882600.35394: done checking for any_errors_fatal 15621 1726882600.35395: checking for max_fail_percentage 15621 1726882600.35397: done checking for max_fail_percentage 15621 1726882600.35398: checking to see if all hosts have failed and the running result is not ok 15621 1726882600.35399: done checking to see if all hosts have failed 15621 1726882600.35400: getting the remaining hosts for this loop 15621 1726882600.35401: done getting the remaining hosts for this loop 15621 1726882600.35405: getting the next task for host managed_node3 15621 1726882600.35410: done getting 
next task for host managed_node3 15621 1726882600.35412: ^ task is: TASK: meta (flush_handlers) 15621 1726882600.35414: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15621 1726882600.35417: getting variables 15621 1726882600.35419: in VariableManager get_vars() 15621 1726882600.35445: Calling all_inventory to load vars for managed_node3 15621 1726882600.35448: Calling groups_inventory to load vars for managed_node3 15621 1726882600.35451: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882600.35469: Calling all_plugins_play to load vars for managed_node3 15621 1726882600.35475: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882600.35478: Calling groups_plugins_play to load vars for managed_node3 15621 1726882600.37269: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882600.39162: done with get_vars() 15621 1726882600.39201: done getting variables 15621 1726882600.39302: in VariableManager get_vars() 15621 1726882600.39312: Calling all_inventory to load vars for managed_node3 15621 1726882600.39320: Calling groups_inventory to load vars for managed_node3 15621 1726882600.39325: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882600.39335: Calling all_plugins_play to load vars for managed_node3 15621 1726882600.39338: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882600.39347: Calling groups_plugins_play to load vars for managed_node3 15621 1726882600.40891: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882600.43067: done with get_vars() 15621 1726882600.43100: done queuing things up, now waiting for results queue to drain 15621 1726882600.43102: results queue empty 15621 1726882600.43103: checking for any_errors_fatal 15621 1726882600.43107: done checking for any_errors_fatal 15621 1726882600.43108: checking for max_fail_percentage 15621 1726882600.43109: done checking for max_fail_percentage 15621 1726882600.43110: checking to see if all hosts have failed and the running result is not ok 15621 1726882600.43115: done checking to see if all hosts have failed 15621 1726882600.43116: getting the remaining hosts for this loop 15621 1726882600.43117: done getting the remaining hosts for this loop 15621 1726882600.43120: getting the next task for host managed_node3 15621 1726882600.43127: done getting next task for host managed_node3 15621 1726882600.43130: ^ task is: TASK: Show network_provider 15621 1726882600.43132: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882600.43134: getting variables 15621 1726882600.43135: in VariableManager get_vars() 15621 1726882600.43145: Calling all_inventory to load vars for managed_node3 15621 1726882600.43148: Calling groups_inventory to load vars for managed_node3 15621 1726882600.43150: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882600.43156: Calling all_plugins_play to load vars for managed_node3 15621 1726882600.43158: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882600.43161: Calling groups_plugins_play to load vars for managed_node3 15621 1726882600.44629: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882600.46153: done with get_vars() 15621 1726882600.46180: done getting variables 15621 1726882600.46231: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show network_provider] *************************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:53 Friday 20 September 2024 21:36:40 -0400 (0:00:02.425) 0:00:32.541 ****** 15621 1726882600.46259: entering _queue_task() for managed_node3/debug 15621 1726882600.46615: worker is 1 (out of 1 available) 15621 1726882600.46630: exiting _queue_task() for managed_node3/debug 15621 1726882600.46644: done queuing things up, now waiting for results queue to drain 15621 1726882600.46645: waiting for pending results... 15621 1726882600.47046: running TaskExecutor() for managed_node3/TASK: Show network_provider 15621 1726882600.47067: in run() - task 0affc7ec-ae25-af1a-5b92-000000000033 15621 1726882600.47095: variable 'ansible_search_path' from source: unknown 15621 1726882600.47150: calling self._execute() 15621 1726882600.47261: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882600.47279: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882600.47295: variable 'omit' from source: magic vars 15621 1726882600.47746: variable 'ansible_distribution_major_version' from source: facts 15621 1726882600.47766: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882600.47782: variable 'omit' from source: magic vars 15621 1726882600.47827: variable 'omit' from source: magic vars 15621 1726882600.47874: variable 'omit' from source: magic vars 15621 1726882600.47930: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882600.47980: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882600.48011: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882600.48040: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882600.48059: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882600.48101: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882600.48112: variable 'ansible_host' from source: host vars for 
'managed_node3' 15621 1726882600.48127: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882600.48339: Set connection var ansible_connection to ssh 15621 1726882600.48343: Set connection var ansible_shell_executable to /bin/sh 15621 1726882600.48345: Set connection var ansible_timeout to 10 15621 1726882600.48347: Set connection var ansible_shell_type to sh 15621 1726882600.48349: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882600.48351: Set connection var ansible_pipelining to False 15621 1726882600.48353: variable 'ansible_shell_executable' from source: unknown 15621 1726882600.48355: variable 'ansible_connection' from source: unknown 15621 1726882600.48357: variable 'ansible_module_compression' from source: unknown 15621 1726882600.48360: variable 'ansible_shell_type' from source: unknown 15621 1726882600.48362: variable 'ansible_shell_executable' from source: unknown 15621 1726882600.48364: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882600.48366: variable 'ansible_pipelining' from source: unknown 15621 1726882600.48368: variable 'ansible_timeout' from source: unknown 15621 1726882600.48370: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882600.48532: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15621 1726882600.48553: variable 'omit' from source: magic vars 15621 1726882600.48568: starting attempt loop 15621 1726882600.48579: running the handler 15621 1726882600.48635: variable 'network_provider' from source: set_fact 15621 1726882600.48727: variable 'network_provider' from source: set_fact 15621 1726882600.48744: handler run complete 15621 1726882600.48770: attempt loop complete, returning result 15621 1726882600.48787: _execute() done 15621 1726882600.48828: dumping result to json 15621 1726882600.48834: done dumping result, returning 15621 1726882600.48838: done running TaskExecutor() for managed_node3/TASK: Show network_provider [0affc7ec-ae25-af1a-5b92-000000000033] 15621 1726882600.48842: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000033 15621 1726882600.49076: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000033 15621 1726882600.49081: WORKER PROCESS EXITING ok: [managed_node3] => { "network_provider": "nm" } 15621 1726882600.49137: no more pending results, returning what we have 15621 1726882600.49141: results queue empty 15621 1726882600.49142: checking for any_errors_fatal 15621 1726882600.49145: done checking for any_errors_fatal 15621 1726882600.49146: checking for max_fail_percentage 15621 1726882600.49147: done checking for max_fail_percentage 15621 1726882600.49148: checking to see if all hosts have failed and the running result is not ok 15621 1726882600.49150: done checking to see if all hosts have failed 15621 1726882600.49151: getting the remaining hosts for this loop 15621 1726882600.49152: done getting the remaining hosts for this loop 15621 1726882600.49157: getting the next task for host managed_node3 15621 1726882600.49165: done getting next task for host managed_node3 15621 1726882600.49168: ^ task is: TASK: meta (flush_handlers) 15621 1726882600.49170: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15621 1726882600.49177: getting variables 15621 1726882600.49179: in VariableManager get_vars() 15621 1726882600.49212: Calling all_inventory to load vars for managed_node3 15621 1726882600.49215: Calling groups_inventory to load vars for managed_node3 15621 1726882600.49220: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882600.49237: Calling all_plugins_play to load vars for managed_node3 15621 1726882600.49240: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882600.49244: Calling groups_plugins_play to load vars for managed_node3 15621 1726882600.51079: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882600.53284: done with get_vars() 15621 1726882600.53307: done getting variables 15621 1726882600.53385: in VariableManager get_vars() 15621 1726882600.53396: Calling all_inventory to load vars for managed_node3 15621 1726882600.53398: Calling groups_inventory to load vars for managed_node3 15621 1726882600.53401: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882600.53406: Calling all_plugins_play to load vars for managed_node3 15621 1726882600.53408: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882600.53411: Calling groups_plugins_play to load vars for managed_node3 15621 1726882600.54852: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882600.58652: done with get_vars() 15621 1726882600.58690: done queuing things up, now waiting for results queue to drain 15621 1726882600.58692: results queue empty 15621 1726882600.58693: checking for any_errors_fatal 15621 1726882600.58697: done checking for any_errors_fatal 15621 1726882600.58698: checking for max_fail_percentage 15621 1726882600.58699: done checking for max_fail_percentage 15621 1726882600.58700: checking to see if all hosts have failed and the running result is not ok 15621 1726882600.58700: done checking to see if all hosts have failed 15621 1726882600.58701: getting the remaining hosts for this loop 15621 1726882600.58702: done getting the remaining hosts for this loop 15621 1726882600.58705: getting the next task for host managed_node3 15621 1726882600.58715: done getting next task for host managed_node3 15621 1726882600.58716: ^ task is: TASK: meta (flush_handlers) 15621 1726882600.58718: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882600.58721: getting variables 15621 1726882600.58829: in VariableManager get_vars() 15621 1726882600.58841: Calling all_inventory to load vars for managed_node3 15621 1726882600.58844: Calling groups_inventory to load vars for managed_node3 15621 1726882600.58847: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882600.58852: Calling all_plugins_play to load vars for managed_node3 15621 1726882600.58855: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882600.58858: Calling groups_plugins_play to load vars for managed_node3 15621 1726882600.61997: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882600.64847: done with get_vars() 15621 1726882600.64880: done getting variables 15621 1726882600.64944: in VariableManager get_vars() 15621 1726882600.64955: Calling all_inventory to load vars for managed_node3 15621 1726882600.64958: Calling groups_inventory to load vars for managed_node3 15621 1726882600.64961: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882600.64966: Calling all_plugins_play to load vars for managed_node3 15621 1726882600.64969: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882600.64975: Calling groups_plugins_play to load vars for managed_node3 15621 1726882600.66449: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882600.68643: done with get_vars() 15621 1726882600.68669: done queuing things up, now waiting for results queue to drain 15621 1726882600.68674: results queue empty 15621 1726882600.68675: checking for any_errors_fatal 15621 1726882600.68676: done checking for any_errors_fatal 15621 1726882600.68677: checking for max_fail_percentage 15621 1726882600.68679: done checking for max_fail_percentage 15621 1726882600.68679: checking to see if all hosts have failed and the running result is not ok 15621 1726882600.68680: done checking to see if all hosts have failed 15621 1726882600.68681: getting the remaining hosts for this loop 15621 1726882600.68682: done getting the remaining hosts for this loop 15621 1726882600.68685: getting the next task for host managed_node3 15621 1726882600.68688: done getting next task for host managed_node3 15621 1726882600.68689: ^ task is: None 15621 1726882600.68690: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882600.68691: done queuing things up, now waiting for results queue to drain 15621 1726882600.68692: results queue empty 15621 1726882600.68693: checking for any_errors_fatal 15621 1726882600.68694: done checking for any_errors_fatal 15621 1726882600.68694: checking for max_fail_percentage 15621 1726882600.68695: done checking for max_fail_percentage 15621 1726882600.68696: checking to see if all hosts have failed and the running result is not ok 15621 1726882600.68697: done checking to see if all hosts have failed 15621 1726882600.68698: getting the next task for host managed_node3 15621 1726882600.68700: done getting next task for host managed_node3 15621 1726882600.68701: ^ task is: None 15621 1726882600.68702: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15621 1726882600.68746: in VariableManager get_vars() 15621 1726882600.68774: done with get_vars() 15621 1726882600.68781: in VariableManager get_vars() 15621 1726882600.68796: done with get_vars() 15621 1726882600.68800: variable 'omit' from source: magic vars 15621 1726882600.68934: variable 'profile' from source: play vars 15621 1726882600.69063: in VariableManager get_vars() 15621 1726882600.69082: done with get_vars() 15621 1726882600.69104: variable 'omit' from source: magic vars 15621 1726882600.69178: variable 'profile' from source: play vars PLAY [Set down {{ profile }}] ************************************************** 15621 1726882600.69928: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15621 1726882600.70113: getting the remaining hosts for this loop 15621 1726882600.70115: done getting the remaining hosts for this loop 15621 1726882600.70117: getting the next task for host managed_node3 15621 1726882600.70120: done getting next task for host managed_node3 15621 1726882600.70126: ^ task is: TASK: Gathering Facts 15621 1726882600.70127: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882600.70129: getting variables 15621 1726882600.70130: in VariableManager get_vars() 15621 1726882600.70142: Calling all_inventory to load vars for managed_node3 15621 1726882600.70144: Calling groups_inventory to load vars for managed_node3 15621 1726882600.70147: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882600.70152: Calling all_plugins_play to load vars for managed_node3 15621 1726882600.70155: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882600.70158: Calling groups_plugins_play to load vars for managed_node3 15621 1726882600.75977: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882600.78099: done with get_vars() 15621 1726882600.78127: done getting variables 15621 1726882600.78182: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3 Friday 20 September 2024 21:36:40 -0400 (0:00:00.319) 0:00:32.861 ****** 15621 1726882600.78209: entering _queue_task() for managed_node3/gather_facts 15621 1726882600.78755: worker is 1 (out of 1 available) 15621 1726882600.78765: exiting _queue_task() for managed_node3/gather_facts 15621 1726882600.78774: done queuing things up, now waiting for results queue to drain 15621 1726882600.78775: waiting for pending results... 
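
The entries above show the "Set down {{ profile }}" play starting up: the linear strategy is loaded, the Gathering Facts task is handed to _queue_task(), a single worker (1 out of 1 available) picks it up, and the strategy then waits for the results queue to drain. The snippet below is a minimal illustrative sketch of that queue/worker handoff in plain Python; it is not Ansible's WorkerProcess or strategy code, and the queue and function names are invented for the example.

    # Simplified sketch of the queue/worker pattern visible in the log:
    # the strategy queues a task, one worker executes it, and the parent
    # drains the result from a results queue.
    import queue
    import threading

    task_queue = queue.Queue()
    results_queue = queue.Queue()

    def worker() -> None:
        while True:
            task = task_queue.get()
            if task is None:                     # sentinel: nothing left to run
                break
            results_queue.put((task, "ok"))      # stand-in for the task result
            task_queue.task_done()

    threading.Thread(target=worker, daemon=True).start()

    task_queue.put("Gathering Facts")            # roughly: _queue_task()
    task_queue.put(None)
    print(results_queue.get())                   # "waiting for pending results..."

In the real run the worker invokes TaskExecutor() for the task and the parent process prints the "no more pending results" bookkeeping that recurs throughout this log.
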
15621 1726882600.78904: running TaskExecutor() for managed_node3/TASK: Gathering Facts 15621 1726882600.78982: in run() - task 0affc7ec-ae25-af1a-5b92-00000000032b 15621 1726882600.79012: variable 'ansible_search_path' from source: unknown 15621 1726882600.79060: calling self._execute() 15621 1726882600.79168: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882600.79220: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882600.79226: variable 'omit' from source: magic vars 15621 1726882600.79617: variable 'ansible_distribution_major_version' from source: facts 15621 1726882600.79639: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882600.79659: variable 'omit' from source: magic vars 15621 1726882600.79768: variable 'omit' from source: magic vars 15621 1726882600.79772: variable 'omit' from source: magic vars 15621 1726882600.79786: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882600.79832: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882600.79857: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882600.79892: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882600.79910: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882600.79951: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882600.79960: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882600.79969: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882600.80098: Set connection var ansible_connection to ssh 15621 1726882600.80115: Set connection var ansible_shell_executable to /bin/sh 15621 1726882600.80130: Set connection var ansible_timeout to 10 15621 1726882600.80201: Set connection var ansible_shell_type to sh 15621 1726882600.80205: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882600.80208: Set connection var ansible_pipelining to False 15621 1726882600.80211: variable 'ansible_shell_executable' from source: unknown 15621 1726882600.80214: variable 'ansible_connection' from source: unknown 15621 1726882600.80216: variable 'ansible_module_compression' from source: unknown 15621 1726882600.80219: variable 'ansible_shell_type' from source: unknown 15621 1726882600.80221: variable 'ansible_shell_executable' from source: unknown 15621 1726882600.80225: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882600.80228: variable 'ansible_pipelining' from source: unknown 15621 1726882600.80230: variable 'ansible_timeout' from source: unknown 15621 1726882600.80238: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882600.80450: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15621 1726882600.80469: variable 'omit' from source: magic vars 15621 1726882600.80481: starting attempt loop 15621 1726882600.80488: running the 
handler 15621 1726882600.80509: variable 'ansible_facts' from source: unknown 15621 1726882600.80643: _low_level_execute_command(): starting 15621 1726882600.80647: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15621 1726882600.81428: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882600.81471: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882600.81490: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882600.81519: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882600.81653: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882600.83434: stdout chunk (state=3): >>>/root <<< 15621 1726882600.83584: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882600.83620: stderr chunk (state=3): >>><<< 15621 1726882600.83635: stdout chunk (state=3): >>><<< 15621 1726882600.83665: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882600.83689: _low_level_execute_command(): starting 15621 1726882600.83755: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882600.836741-16866-131392249667480 `" && echo ansible-tmp-1726882600.836741-16866-131392249667480="` echo 
/root/.ansible/tmp/ansible-tmp-1726882600.836741-16866-131392249667480 `" ) && sleep 0' 15621 1726882600.84291: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882600.84307: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882600.84321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882600.84435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882600.84469: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882600.84586: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882600.86582: stdout chunk (state=3): >>>ansible-tmp-1726882600.836741-16866-131392249667480=/root/.ansible/tmp/ansible-tmp-1726882600.836741-16866-131392249667480 <<< 15621 1726882600.86728: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882600.86777: stderr chunk (state=3): >>><<< 15621 1726882600.86794: stdout chunk (state=3): >>><<< 15621 1726882600.86848: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882600.836741-16866-131392249667480=/root/.ansible/tmp/ansible-tmp-1726882600.836741-16866-131392249667480 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882600.86852: variable 'ansible_module_compression' from source: unknown 15621 1726882600.86904: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15621x2kdpbzd/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15621 
1726882600.86986: variable 'ansible_facts' from source: unknown 15621 1726882600.87156: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882600.836741-16866-131392249667480/AnsiballZ_setup.py 15621 1726882600.87350: Sending initial data 15621 1726882600.87354: Sent initial data (153 bytes) 15621 1726882600.88101: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882600.88116: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882600.88167: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882600.88259: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882600.89862: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15621 1726882600.89974: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15621 1726882600.90076: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmpvau_2c_9 /root/.ansible/tmp/ansible-tmp-1726882600.836741-16866-131392249667480/AnsiballZ_setup.py <<< 15621 1726882600.90079: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882600.836741-16866-131392249667480/AnsiballZ_setup.py" <<< 15621 1726882600.90152: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmpvau_2c_9" to remote "/root/.ansible/tmp/ansible-tmp-1726882600.836741-16866-131392249667480/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882600.836741-16866-131392249667480/AnsiballZ_setup.py" <<< 15621 1726882600.91995: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882600.92033: stderr chunk (state=3): >>><<< 15621 1726882600.92036: stdout chunk (state=3): >>><<< 15621 1726882600.92056: done transferring module to remote 15621 1726882600.92076: _low_level_execute_command(): starting 15621 1726882600.92158: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882600.836741-16866-131392249667480/ /root/.ansible/tmp/ansible-tmp-1726882600.836741-16866-131392249667480/AnsiballZ_setup.py && sleep 0' 15621 1726882600.92750: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882600.92777: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882600.92839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882600.92911: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882600.92935: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882600.92974: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882600.93053: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882600.94991: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882600.94994: stdout chunk (state=3): >>><<< 15621 1726882600.94996: stderr chunk (state=3): >>><<< 15621 1726882600.95095: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882600.95103: _low_level_execute_command(): starting 15621 1726882600.95111: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882600.836741-16866-131392249667480/AnsiballZ_setup.py && sleep 0' 15621 1726882600.95699: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882600.95714: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882600.95731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882600.95747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882600.95789: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882600.95883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882600.95929: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882600.96030: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882603.03126: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "ip-10-31-45-226.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-226", "ansible_nodename": "ip-10-31-45-226.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": 
"ec22ea14692d25e88f0b7167787b368d", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3104, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 612, "free": 3104}, "nocache": {"free": 3487, "used": 229}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ea14-692d-25e8-8f0b-7167787b368d", "ansible_product_uuid": "ec22ea14-692d-25e8-8f0b-7167787b368d", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 747, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251384270848, "block_size": 4096, "block_total": 64483404, "block_available": 61373113, "block_used": 3110291, "inode_total": 16384000, "inode_available": 16303143, "inode_used": 80857, "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"}], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", "second": "42", "epoch": "1726882602", "epoch_int": "1726882602", "date": "2024-09-20", "time": "21:36:42", "iso8601_micro": "2024-09-21T01:36:42.982538Z", "iso8601": "2024-09-21T01:36:42Z", "iso8601_basic": "20240920T213642982538", "iso8601_basic_short": 
"20240920T213642", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCqxUjCEqNblrTyW6Uf6mIYxca8N+p8oJNuOoXU65bGRNg3CMa5WjaOdqcLJoa5cqHU94Eb2GKTTyez0hcVUk7tsi3NxQudVrBDJQwbGPLKwHTfAOeffQrSKU6cQIc1wl+jLeNyQet7t+mRPHDLLjdsLuWud7KDSFY7tB05hqCIT7br7Ql/dFhcnCdWQFQMOHFOz3ScJe9gey/LD3ji7GRONjSr/t5cpKmB6mxzEmsb1n6YZdbP8HCphGcvKR4W+uaX3gVfQE0qvrqlobTyex8yIrkML2bRGO0cQ0YQWRYUwl+2NZufO8pixR1WlzvjooEQLCa77cJ6SZ8LyFkDOI+mMyuj8kcM9hS4AD91rPxl8C0d6Jg8RKqnImxC3X/NNaRYHqlewUGo6VKkcO4+lxgJGqYFcmkGEHzq4fuf6gtrr3rJkcIFcrluI0mSyZ2wXzI9K1OLHK0fnDvDUdV21RdTxfpz2ZFqykIWxdtugE4qaNMgbtV0VnufdkfZoCt9ayU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCVkskQ7Wf194qJaR5aLJzIbxDeKLsVL0wQFKV8r0F7GGZAGvI7/LHajoQ1NvRR35h4P+UpQQWPriVBtLfXYfXQ=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHICQbhAvKstSrwCX3R+nlPjOjLF0EHt/gL32n1ZS9Xl", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fibre_channel_wwn": [], "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_interfaces": ["eth0", "lsr27", "lo", "peerlsr27"], "ansible_lsr27": {"device": "lsr27", "macaddress": "b2:cc:5d:76:bc:9a", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv4": {"address": "192.0.2.1", "broadcast": "192.0.2.255", "netmask": "255.255.255.0", "network": "192.0.2.0", "prefix": "24"}, "ipv6": [{"address": "fe80::b0cc:5dff:fe76:bc9a", "prefix": "64", "scope": "link"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "02:19:da:ea:a3:f3", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.226", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::19:daff:feea:a3f3", "prefix": "64", "scope": "link"}]}, "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "8a:9f:5f:c3:90:d4", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::889f:5fff:fec3:90d4", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.226", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:19:da:ea:a3:f3", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["192.0.2.1", "10.31.45.226"], "ansible_all_ipv6_addresses": 
["fe80::b0cc:5dff:fe76:bc9a", "fe80::19:daff:feea:a3f3", "fe80::889f:5fff:fec3:90d4"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.226", "127.0.0.0/8", "127.0.0.1", "192.0.2.1"], "ipv6": ["::1", "fe80::19:daff:feea:a3f3", "fe80::889f:5fff:fec3:90d4", "fe80::b0cc:5dff:fe76:bc9a"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:b5954bb9-e972-4b2a-94f1-a82c77e96f77", "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.180 36814 10.31.45.226 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 36814 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fips": false, "ansible_local": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_iscsi_iqn": "", "ansible_is_chroot": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_pkg_mgr": "dnf", "ansible_lsb": {}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_loadavg": {"1m": 0.68798828125, "5m": 0.63916015625, "15m": 0.326171875}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15621 1726882603.05128: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 
<<< 15621 1726882603.05329: stderr chunk (state=3): >>><<< 15621 1726882603.05333: stdout chunk (state=3): >>><<< 15621 1726882603.05337: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "ip-10-31-45-226.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-226", "ansible_nodename": "ip-10-31-45-226.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ea14692d25e88f0b7167787b368d", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3104, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 612, "free": 3104}, "nocache": {"free": 3487, "used": 229}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ea14-692d-25e8-8f0b-7167787b368d", "ansible_product_uuid": "ec22ea14-692d-25e8-8f0b-7167787b368d", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": 
{"xvda2": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 747, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251384270848, "block_size": 4096, "block_total": 64483404, "block_available": 61373113, "block_used": 3110291, "inode_total": 16384000, "inode_available": 16303143, "inode_used": 80857, "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"}], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", "second": "42", "epoch": "1726882602", "epoch_int": "1726882602", "date": "2024-09-20", "time": "21:36:42", "iso8601_micro": "2024-09-21T01:36:42.982538Z", "iso8601": "2024-09-21T01:36:42Z", "iso8601_basic": "20240920T213642982538", "iso8601_basic_short": "20240920T213642", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCqxUjCEqNblrTyW6Uf6mIYxca8N+p8oJNuOoXU65bGRNg3CMa5WjaOdqcLJoa5cqHU94Eb2GKTTyez0hcVUk7tsi3NxQudVrBDJQwbGPLKwHTfAOeffQrSKU6cQIc1wl+jLeNyQet7t+mRPHDLLjdsLuWud7KDSFY7tB05hqCIT7br7Ql/dFhcnCdWQFQMOHFOz3ScJe9gey/LD3ji7GRONjSr/t5cpKmB6mxzEmsb1n6YZdbP8HCphGcvKR4W+uaX3gVfQE0qvrqlobTyex8yIrkML2bRGO0cQ0YQWRYUwl+2NZufO8pixR1WlzvjooEQLCa77cJ6SZ8LyFkDOI+mMyuj8kcM9hS4AD91rPxl8C0d6Jg8RKqnImxC3X/NNaRYHqlewUGo6VKkcO4+lxgJGqYFcmkGEHzq4fuf6gtrr3rJkcIFcrluI0mSyZ2wXzI9K1OLHK0fnDvDUdV21RdTxfpz2ZFqykIWxdtugE4qaNMgbtV0VnufdkfZoCt9ayU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCVkskQ7Wf194qJaR5aLJzIbxDeKLsVL0wQFKV8r0F7GGZAGvI7/LHajoQ1NvRR35h4P+UpQQWPriVBtLfXYfXQ=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHICQbhAvKstSrwCX3R+nlPjOjLF0EHt/gL32n1ZS9Xl", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fibre_channel_wwn": [], "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_interfaces": ["eth0", "lsr27", "lo", "peerlsr27"], "ansible_lsr27": {"device": "lsr27", "macaddress": "b2:cc:5d:76:bc:9a", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv4": {"address": "192.0.2.1", "broadcast": "192.0.2.255", "netmask": "255.255.255.0", "network": "192.0.2.0", "prefix": "24"}, "ipv6": [{"address": "fe80::b0cc:5dff:fe76:bc9a", "prefix": "64", "scope": "link"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "02:19:da:ea:a3:f3", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.226", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": 
"fe80::19:daff:feea:a3f3", "prefix": "64", "scope": "link"}]}, "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "8a:9f:5f:c3:90:d4", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::889f:5fff:fec3:90d4", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.226", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:19:da:ea:a3:f3", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["192.0.2.1", "10.31.45.226"], "ansible_all_ipv6_addresses": ["fe80::b0cc:5dff:fe76:bc9a", "fe80::19:daff:feea:a3f3", "fe80::889f:5fff:fec3:90d4"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.226", "127.0.0.0/8", "127.0.0.1", "192.0.2.1"], "ipv6": ["::1", "fe80::19:daff:feea:a3f3", "fe80::889f:5fff:fec3:90d4", "fe80::b0cc:5dff:fe76:bc9a"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:b5954bb9-e972-4b2a-94f1-a82c77e96f77", "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.180 36814 10.31.45.226 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 36814 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fips": false, "ansible_local": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_iscsi_iqn": "", "ansible_is_chroot": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_pkg_mgr": "dnf", "ansible_lsb": {}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_loadavg": {"1m": 0.68798828125, "5m": 0.63916015625, "15m": 0.326171875}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, 
"invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 15621 1726882603.05629: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882600.836741-16866-131392249667480/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15621 1726882603.05649: _low_level_execute_command(): starting 15621 1726882603.05658: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882600.836741-16866-131392249667480/ > /dev/null 2>&1 && sleep 0' 15621 1726882603.06325: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882603.06440: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882603.06503: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 15621 1726882603.06652: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882603.08834: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882603.08838: stdout chunk (state=3): >>><<< 15621 1726882603.08841: stderr chunk (state=3): >>><<< 15621 1726882603.08859: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882603.08873: handler run complete 15621 1726882603.09286: variable 'ansible_facts' from source: unknown 15621 1726882603.09513: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882603.10284: variable 'ansible_facts' from source: unknown 15621 1726882603.10582: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882603.10908: attempt loop complete, returning result 15621 1726882603.10912: _execute() done 15621 1726882603.10914: dumping result to json 15621 1726882603.10941: done dumping result, returning 15621 1726882603.10954: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [0affc7ec-ae25-af1a-5b92-00000000032b] 15621 1726882603.10964: sending task result for task 0affc7ec-ae25-af1a-5b92-00000000032b ok: [managed_node3] 15621 1726882603.11899: no more pending results, returning what we have 15621 1726882603.11903: results queue empty 15621 1726882603.11904: checking for any_errors_fatal 15621 1726882603.11905: done checking for any_errors_fatal 15621 1726882603.11906: checking for max_fail_percentage 15621 1726882603.11907: done checking for max_fail_percentage 15621 1726882603.11909: checking to see if all hosts have failed and the running result is not ok 15621 1726882603.11911: done checking to see if all hosts have failed 15621 1726882603.11911: getting the remaining hosts for this loop 15621 1726882603.11913: done getting the remaining hosts for this loop 15621 1726882603.11918: getting the next task for host managed_node3 15621 1726882603.12041: done getting next task for host managed_node3 15621 1726882603.12044: ^ task is: TASK: meta (flush_handlers) 15621 1726882603.12046: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, 
pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15621 1726882603.12051: getting variables 15621 1726882603.12052: in VariableManager get_vars() 15621 1726882603.12088: Calling all_inventory to load vars for managed_node3 15621 1726882603.12091: Calling groups_inventory to load vars for managed_node3 15621 1726882603.12093: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882603.12108: Calling all_plugins_play to load vars for managed_node3 15621 1726882603.12111: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882603.12114: Calling groups_plugins_play to load vars for managed_node3 15621 1726882603.12739: done sending task result for task 0affc7ec-ae25-af1a-5b92-00000000032b 15621 1726882603.12742: WORKER PROCESS EXITING 15621 1726882603.14048: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882603.16239: done with get_vars() 15621 1726882603.16270: done getting variables 15621 1726882603.16344: in VariableManager get_vars() 15621 1726882603.16357: Calling all_inventory to load vars for managed_node3 15621 1726882603.16360: Calling groups_inventory to load vars for managed_node3 15621 1726882603.16362: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882603.16367: Calling all_plugins_play to load vars for managed_node3 15621 1726882603.16369: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882603.16372: Calling groups_plugins_play to load vars for managed_node3 15621 1726882603.17974: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882603.20070: done with get_vars() 15621 1726882603.20103: done queuing things up, now waiting for results queue to drain 15621 1726882603.20105: results queue empty 15621 1726882603.20106: checking for any_errors_fatal 15621 1726882603.20110: done checking for any_errors_fatal 15621 1726882603.20111: checking for max_fail_percentage 15621 1726882603.20113: done checking for max_fail_percentage 15621 1726882603.20113: checking to see if all hosts have failed and the running result is not ok 15621 1726882603.20114: done checking to see if all hosts have failed 15621 1726882603.20119: getting the remaining hosts for this loop 15621 1726882603.20120: done getting the remaining hosts for this loop 15621 1726882603.20126: getting the next task for host managed_node3 15621 1726882603.20130: done getting next task for host managed_node3 15621 1726882603.20134: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15621 1726882603.20135: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882603.20146: getting variables 15621 1726882603.20147: in VariableManager get_vars() 15621 1726882603.20162: Calling all_inventory to load vars for managed_node3 15621 1726882603.20165: Calling groups_inventory to load vars for managed_node3 15621 1726882603.20173: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882603.20179: Calling all_plugins_play to load vars for managed_node3 15621 1726882603.20182: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882603.20185: Calling groups_plugins_play to load vars for managed_node3 15621 1726882603.21678: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882603.23815: done with get_vars() 15621 1726882603.23841: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:36:43 -0400 (0:00:02.457) 0:00:35.318 ****** 15621 1726882603.23940: entering _queue_task() for managed_node3/include_tasks 15621 1726882603.24537: worker is 1 (out of 1 available) 15621 1726882603.24548: exiting _queue_task() for managed_node3/include_tasks 15621 1726882603.24558: done queuing things up, now waiting for results queue to drain 15621 1726882603.24560: waiting for pending results... 15621 1726882603.24679: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15621 1726882603.24829: in run() - task 0affc7ec-ae25-af1a-5b92-00000000003c 15621 1726882603.24854: variable 'ansible_search_path' from source: unknown 15621 1726882603.24863: variable 'ansible_search_path' from source: unknown 15621 1726882603.25005: calling self._execute() 15621 1726882603.25036: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882603.25048: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882603.25062: variable 'omit' from source: magic vars 15621 1726882603.25486: variable 'ansible_distribution_major_version' from source: facts 15621 1726882603.25504: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882603.25516: _execute() done 15621 1726882603.25528: dumping result to json 15621 1726882603.25537: done dumping result, returning 15621 1726882603.25557: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affc7ec-ae25-af1a-5b92-00000000003c] 15621 1726882603.25570: sending task result for task 0affc7ec-ae25-af1a-5b92-00000000003c 15621 1726882603.25806: no more pending results, returning what we have 15621 1726882603.25812: in VariableManager get_vars() 15621 1726882603.25861: Calling all_inventory to load vars for managed_node3 15621 1726882603.25864: Calling groups_inventory to load vars for managed_node3 15621 1726882603.25866: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882603.25935: done sending task result for task 0affc7ec-ae25-af1a-5b92-00000000003c 15621 1726882603.25939: WORKER PROCESS EXITING 15621 1726882603.25994: Calling all_plugins_play to load vars for managed_node3 15621 1726882603.25998: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882603.26002: Calling groups_plugins_play to load vars for managed_node3 15621 1726882603.27927: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882603.30598: done with get_vars() 15621 1726882603.30627: variable 'ansible_search_path' from source: unknown 15621 1726882603.30629: variable 'ansible_search_path' from source: unknown 15621 1726882603.30665: we have included files to process 15621 1726882603.30667: generating all_blocks data 15621 1726882603.30668: done generating all_blocks data 15621 1726882603.30669: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15621 1726882603.30670: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15621 1726882603.30673: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15621 1726882603.31335: done processing included file 15621 1726882603.31337: iterating over new_blocks loaded from include file 15621 1726882603.31339: in VariableManager get_vars() 15621 1726882603.31362: done with get_vars() 15621 1726882603.31364: filtering new block on tags 15621 1726882603.31381: done filtering new block on tags 15621 1726882603.31385: in VariableManager get_vars() 15621 1726882603.31411: done with get_vars() 15621 1726882603.31413: filtering new block on tags 15621 1726882603.31435: done filtering new block on tags 15621 1726882603.31438: in VariableManager get_vars() 15621 1726882603.31460: done with get_vars() 15621 1726882603.31462: filtering new block on tags 15621 1726882603.31479: done filtering new block on tags 15621 1726882603.31482: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 15621 1726882603.31488: extending task lists for all hosts with included blocks 15621 1726882603.31934: done extending task lists 15621 1726882603.31936: done processing included files 15621 1726882603.31937: results queue empty 15621 1726882603.31938: checking for any_errors_fatal 15621 1726882603.31939: done checking for any_errors_fatal 15621 1726882603.31940: checking for max_fail_percentage 15621 1726882603.31941: done checking for max_fail_percentage 15621 1726882603.31942: checking to see if all hosts have failed and the running result is not ok 15621 1726882603.31943: done checking to see if all hosts have failed 15621 1726882603.31943: getting the remaining hosts for this loop 15621 1726882603.31945: done getting the remaining hosts for this loop 15621 1726882603.31947: getting the next task for host managed_node3 15621 1726882603.31956: done getting next task for host managed_node3 15621 1726882603.31959: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15621 1726882603.31961: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882603.31971: getting variables 15621 1726882603.31972: in VariableManager get_vars() 15621 1726882603.31986: Calling all_inventory to load vars for managed_node3 15621 1726882603.31988: Calling groups_inventory to load vars for managed_node3 15621 1726882603.31991: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882603.31996: Calling all_plugins_play to load vars for managed_node3 15621 1726882603.31999: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882603.32002: Calling groups_plugins_play to load vars for managed_node3 15621 1726882603.33600: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882603.35746: done with get_vars() 15621 1726882603.35774: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:36:43 -0400 (0:00:00.119) 0:00:35.437 ****** 15621 1726882603.35853: entering _queue_task() for managed_node3/setup 15621 1726882603.36272: worker is 1 (out of 1 available) 15621 1726882603.36285: exiting _queue_task() for managed_node3/setup 15621 1726882603.36299: done queuing things up, now waiting for results queue to drain 15621 1726882603.36301: waiting for pending results... 15621 1726882603.36576: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15621 1726882603.36739: in run() - task 0affc7ec-ae25-af1a-5b92-00000000036c 15621 1726882603.36860: variable 'ansible_search_path' from source: unknown 15621 1726882603.36864: variable 'ansible_search_path' from source: unknown 15621 1726882603.36867: calling self._execute() 15621 1726882603.36918: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882603.36934: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882603.36950: variable 'omit' from source: magic vars 15621 1726882603.37353: variable 'ansible_distribution_major_version' from source: facts 15621 1726882603.37370: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882603.37619: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15621 1726882603.40131: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15621 1726882603.40135: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15621 1726882603.40138: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15621 1726882603.40141: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15621 1726882603.40143: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15621 1726882603.40187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882603.40225: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 15621 1726882603.40265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882603.40315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882603.40339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882603.40410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882603.40443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882603.40481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882603.40532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882603.40554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882603.40751: variable '__network_required_facts' from source: role '' defaults 15621 1726882603.40766: variable 'ansible_facts' from source: unknown 15621 1726882603.42127: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 15621 1726882603.42133: when evaluation is False, skipping this task 15621 1726882603.42136: _execute() done 15621 1726882603.42139: dumping result to json 15621 1726882603.42141: done dumping result, returning 15621 1726882603.42144: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affc7ec-ae25-af1a-5b92-00000000036c] 15621 1726882603.42147: sending task result for task 0affc7ec-ae25-af1a-5b92-00000000036c skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15621 1726882603.42372: no more pending results, returning what we have 15621 1726882603.42377: results queue empty 15621 1726882603.42378: checking for any_errors_fatal 15621 1726882603.42380: done checking for any_errors_fatal 15621 1726882603.42382: checking for max_fail_percentage 15621 1726882603.42383: done checking for max_fail_percentage 15621 1726882603.42384: checking to see if all hosts have failed and the running result is not ok 15621 1726882603.42385: done checking to see if all hosts have failed 15621 1726882603.42386: getting the remaining hosts for this loop 15621 1726882603.42388: done getting the remaining hosts for this loop 15621 1726882603.42393: getting the next task for host managed_node3 15621 1726882603.42403: done getting next task for host 
managed_node3 15621 1726882603.42408: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 15621 1726882603.42412: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15621 1726882603.42435: getting variables 15621 1726882603.42437: in VariableManager get_vars() 15621 1726882603.42487: Calling all_inventory to load vars for managed_node3 15621 1726882603.42491: Calling groups_inventory to load vars for managed_node3 15621 1726882603.42494: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882603.42511: Calling all_plugins_play to load vars for managed_node3 15621 1726882603.42517: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882603.42650: Calling groups_plugins_play to load vars for managed_node3 15621 1726882603.42664: done sending task result for task 0affc7ec-ae25-af1a-5b92-00000000036c 15621 1726882603.42669: WORKER PROCESS EXITING 15621 1726882603.45185: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882603.47944: done with get_vars() 15621 1726882603.47983: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:36:43 -0400 (0:00:00.122) 0:00:35.560 ****** 15621 1726882603.48096: entering _queue_task() for managed_node3/stat 15621 1726882603.48484: worker is 1 (out of 1 available) 15621 1726882603.48499: exiting _queue_task() for managed_node3/stat 15621 1726882603.48628: done queuing things up, now waiting for results queue to drain 15621 1726882603.48631: waiting for pending results... 
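The skip logged just above comes from the conditional "__network_required_facts | difference(ansible_facts.keys() | list) | length > 0": the task would only re-gather facts if at least one fact name the role needs is missing from what the earlier setup run already collected. Below is a minimal Python sketch of that set-difference test; it is an illustration only, and the required-fact names and sample values are hypothetical, not taken from the role's actual __network_required_facts default.

# Minimal Python sketch of the set-difference check evaluated above
# (illustration only; fact names and values below are hypothetical samples).
required_facts = ["distribution", "distribution_major_version", "os_family"]
gathered_facts = {
    "distribution": "Fedora",
    "distribution_major_version": "40",
    "os_family": "RedHat",
}

# Names the role needs that are not present yet.
missing = [name for name in required_facts if name not in gathered_facts]
needs_gathering = len(missing) > 0
print(needs_gathering)  # False -> the task is skipped, as in the log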
15621 1726882603.48829: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 15621 1726882603.48994: in run() - task 0affc7ec-ae25-af1a-5b92-00000000036e 15621 1726882603.49016: variable 'ansible_search_path' from source: unknown 15621 1726882603.49026: variable 'ansible_search_path' from source: unknown 15621 1726882603.49079: calling self._execute() 15621 1726882603.49184: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882603.49198: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882603.49274: variable 'omit' from source: magic vars 15621 1726882603.49623: variable 'ansible_distribution_major_version' from source: facts 15621 1726882603.49642: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882603.49839: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15621 1726882603.50147: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15621 1726882603.50203: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15621 1726882603.50244: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15621 1726882603.50293: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15621 1726882603.50400: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15621 1726882603.50474: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15621 1726882603.50478: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882603.50506: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15621 1726882603.50611: variable '__network_is_ostree' from source: set_fact 15621 1726882603.50625: Evaluated conditional (not __network_is_ostree is defined): False 15621 1726882603.50692: when evaluation is False, skipping this task 15621 1726882603.50696: _execute() done 15621 1726882603.50699: dumping result to json 15621 1726882603.50701: done dumping result, returning 15621 1726882603.50704: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affc7ec-ae25-af1a-5b92-00000000036e] 15621 1726882603.50707: sending task result for task 0affc7ec-ae25-af1a-5b92-00000000036e 15621 1726882603.50782: done sending task result for task 0affc7ec-ae25-af1a-5b92-00000000036e skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 15621 1726882603.51043: no more pending results, returning what we have 15621 1726882603.51047: results queue empty 15621 1726882603.51048: checking for any_errors_fatal 15621 1726882603.51053: done checking for any_errors_fatal 15621 1726882603.51054: checking for max_fail_percentage 15621 1726882603.51056: done 
checking for max_fail_percentage 15621 1726882603.51057: checking to see if all hosts have failed and the running result is not ok 15621 1726882603.51058: done checking to see if all hosts have failed 15621 1726882603.51059: getting the remaining hosts for this loop 15621 1726882603.51060: done getting the remaining hosts for this loop 15621 1726882603.51063: getting the next task for host managed_node3 15621 1726882603.51069: done getting next task for host managed_node3 15621 1726882603.51072: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 15621 1726882603.51075: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15621 1726882603.51089: getting variables 15621 1726882603.51091: in VariableManager get_vars() 15621 1726882603.51135: Calling all_inventory to load vars for managed_node3 15621 1726882603.51138: Calling groups_inventory to load vars for managed_node3 15621 1726882603.51141: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882603.51152: Calling all_plugins_play to load vars for managed_node3 15621 1726882603.51155: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882603.51159: Calling groups_plugins_play to load vars for managed_node3 15621 1726882603.51876: WORKER PROCESS EXITING 15621 1726882603.53883: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882603.56188: done with get_vars() 15621 1726882603.56216: done getting variables 15621 1726882603.56285: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:36:43 -0400 (0:00:00.082) 0:00:35.642 ****** 15621 1726882603.56320: entering _queue_task() for managed_node3/set_fact 15621 1726882603.56736: worker is 1 (out of 1 available) 15621 1726882603.56748: exiting _queue_task() for managed_node3/set_fact 15621 1726882603.56758: done queuing things up, now waiting for results queue to drain 15621 1726882603.56759: waiting for pending results... 
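Both ostree-related tasks, the stat check skipped above and the set_fact queued here, are guarded by "not __network_is_ostree is defined", so once an earlier set_fact has recorded the flag, neither the probe nor the follow-up assignment runs again. The Python-only sketch below illustrates that probe-once-then-cache pattern; the helper and the marker path are illustrative assumptions, not the role's code.

import os

# Stand-in for ansible_facts; the flag was already set earlier in the run.
facts = {"__network_is_ostree": False}

def is_ostree(facts):
    # Equivalent of the guard: when: not __network_is_ostree is defined
    if "__network_is_ostree" in facts:
        return facts["__network_is_ostree"]  # already known -> skip the probe
    # Hypothetical probe; /run/ostree-booted is the conventional marker file.
    facts["__network_is_ostree"] = os.path.exists("/run/ostree-booted")
    return facts["__network_is_ostree"]

print(is_ostree(facts))  # False, and the filesystem is never touched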
15621 1726882603.57030: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 15621 1726882603.57192: in run() - task 0affc7ec-ae25-af1a-5b92-00000000036f 15621 1726882603.57213: variable 'ansible_search_path' from source: unknown 15621 1726882603.57222: variable 'ansible_search_path' from source: unknown 15621 1726882603.57277: calling self._execute() 15621 1726882603.57380: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882603.57394: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882603.57409: variable 'omit' from source: magic vars 15621 1726882603.57833: variable 'ansible_distribution_major_version' from source: facts 15621 1726882603.57849: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882603.58051: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15621 1726882603.58355: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15621 1726882603.58405: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15621 1726882603.58456: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15621 1726882603.58501: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15621 1726882603.58609: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15621 1726882603.58648: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15621 1726882603.58688: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882603.58725: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15621 1726882603.58830: variable '__network_is_ostree' from source: set_fact 15621 1726882603.58843: Evaluated conditional (not __network_is_ostree is defined): False 15621 1726882603.58852: when evaluation is False, skipping this task 15621 1726882603.58865: _execute() done 15621 1726882603.58891: dumping result to json 15621 1726882603.58895: done dumping result, returning 15621 1726882603.58900: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affc7ec-ae25-af1a-5b92-00000000036f] 15621 1726882603.58928: sending task result for task 0affc7ec-ae25-af1a-5b92-00000000036f 15621 1726882603.59045: done sending task result for task 0affc7ec-ae25-af1a-5b92-00000000036f 15621 1726882603.59048: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 15621 1726882603.59150: no more pending results, returning what we have 15621 1726882603.59154: results queue empty 15621 1726882603.59155: checking for any_errors_fatal 15621 1726882603.59164: done checking for any_errors_fatal 15621 
1726882603.59165: checking for max_fail_percentage 15621 1726882603.59167: done checking for max_fail_percentage 15621 1726882603.59168: checking to see if all hosts have failed and the running result is not ok 15621 1726882603.59169: done checking to see if all hosts have failed 15621 1726882603.59170: getting the remaining hosts for this loop 15621 1726882603.59172: done getting the remaining hosts for this loop 15621 1726882603.59176: getting the next task for host managed_node3 15621 1726882603.59236: done getting next task for host managed_node3 15621 1726882603.59241: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 15621 1726882603.59245: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15621 1726882603.59261: getting variables 15621 1726882603.59264: in VariableManager get_vars() 15621 1726882603.59448: Calling all_inventory to load vars for managed_node3 15621 1726882603.59451: Calling groups_inventory to load vars for managed_node3 15621 1726882603.59453: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882603.59463: Calling all_plugins_play to load vars for managed_node3 15621 1726882603.59466: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882603.59469: Calling groups_plugins_play to load vars for managed_node3 15621 1726882603.61329: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882603.63493: done with get_vars() 15621 1726882603.63519: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:36:43 -0400 (0:00:00.073) 0:00:35.715 ****** 15621 1726882603.63627: entering _queue_task() for managed_node3/service_facts 15621 1726882603.63979: worker is 1 (out of 1 available) 15621 1726882603.63993: exiting _queue_task() for managed_node3/service_facts 15621 1726882603.64005: done queuing things up, now waiting for results queue to drain 15621 1726882603.64006: waiting for pending results... 
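The service_facts module queued here returns the large ansible_facts.services mapping shown further below, in which every systemd unit is described by name, state, status and source. The short Python sketch that follows (illustrative, not module or role code) shows how such a mapping is typically filtered; the two entries are copied from the module output later in this log.

# Two entries copied from the service_facts output later in this log.
services = {
    "NetworkManager.service": {"name": "NetworkManager.service",
                               "state": "running", "status": "enabled",
                               "source": "systemd"},
    "systemd-networkd.service": {"name": "systemd-networkd.service",
                                 "state": "stopped", "status": "disabled",
                                 "source": "systemd"},
}

# Pick out units by runtime state and by enablement status.
running = sorted(n for n, svc in services.items() if svc["state"] == "running")
enabled = sorted(n for n, svc in services.items() if svc["status"] == "enabled")
print(running)  # ['NetworkManager.service']
print(enabled)  # ['NetworkManager.service']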
15621 1726882603.64451: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 15621 1726882603.64459: in run() - task 0affc7ec-ae25-af1a-5b92-000000000371 15621 1726882603.64482: variable 'ansible_search_path' from source: unknown 15621 1726882603.64491: variable 'ansible_search_path' from source: unknown 15621 1726882603.64544: calling self._execute() 15621 1726882603.64636: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882603.64759: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882603.64763: variable 'omit' from source: magic vars 15621 1726882603.65072: variable 'ansible_distribution_major_version' from source: facts 15621 1726882603.65099: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882603.65110: variable 'omit' from source: magic vars 15621 1726882603.65178: variable 'omit' from source: magic vars 15621 1726882603.65232: variable 'omit' from source: magic vars 15621 1726882603.65278: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882603.65334: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882603.65357: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882603.65381: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882603.65398: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882603.65444: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882603.65521: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882603.65526: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882603.65579: Set connection var ansible_connection to ssh 15621 1726882603.65603: Set connection var ansible_shell_executable to /bin/sh 15621 1726882603.65629: Set connection var ansible_timeout to 10 15621 1726882603.65632: Set connection var ansible_shell_type to sh 15621 1726882603.65727: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882603.65731: Set connection var ansible_pipelining to False 15621 1726882603.65736: variable 'ansible_shell_executable' from source: unknown 15621 1726882603.65738: variable 'ansible_connection' from source: unknown 15621 1726882603.65741: variable 'ansible_module_compression' from source: unknown 15621 1726882603.65743: variable 'ansible_shell_type' from source: unknown 15621 1726882603.65745: variable 'ansible_shell_executable' from source: unknown 15621 1726882603.65747: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882603.65749: variable 'ansible_pipelining' from source: unknown 15621 1726882603.65751: variable 'ansible_timeout' from source: unknown 15621 1726882603.65753: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882603.65980: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15621 1726882603.65984: variable 'omit' from source: magic vars 15621 
1726882603.65987: starting attempt loop 15621 1726882603.65989: running the handler 15621 1726882603.65991: _low_level_execute_command(): starting 15621 1726882603.65994: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15621 1726882603.66853: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882603.66858: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882603.66890: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882603.66910: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882603.66927: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882603.67130: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882603.68840: stdout chunk (state=3): >>>/root <<< 15621 1726882603.68994: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882603.69242: stderr chunk (state=3): >>><<< 15621 1726882603.69245: stdout chunk (state=3): >>><<< 15621 1726882603.69248: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882603.69255: _low_level_execute_command(): starting 15621 1726882603.69266: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882603.692182-16940-240657676657579 `" && echo 
ansible-tmp-1726882603.692182-16940-240657676657579="` echo /root/.ansible/tmp/ansible-tmp-1726882603.692182-16940-240657676657579 `" ) && sleep 0' 15621 1726882603.71194: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882603.71315: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882603.71355: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882603.71451: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882603.73581: stdout chunk (state=3): >>>ansible-tmp-1726882603.692182-16940-240657676657579=/root/.ansible/tmp/ansible-tmp-1726882603.692182-16940-240657676657579 <<< 15621 1726882603.73663: stdout chunk (state=3): >>><<< 15621 1726882603.73927: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882603.73931: stderr chunk (state=3): >>><<< 15621 1726882603.73934: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882603.692182-16940-240657676657579=/root/.ansible/tmp/ansible-tmp-1726882603.692182-16940-240657676657579 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882603.73937: variable 'ansible_module_compression' from source: unknown 15621 1726882603.73939: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15621x2kdpbzd/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 15621 1726882603.73941: variable 'ansible_facts' 
from source: unknown 15621 1726882603.74181: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882603.692182-16940-240657676657579/AnsiballZ_service_facts.py 15621 1726882603.74405: Sending initial data 15621 1726882603.74414: Sent initial data (161 bytes) 15621 1726882603.74919: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882603.74936: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882603.74951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882603.75038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882603.75065: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882603.75081: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882603.75099: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882603.75210: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882603.76927: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15621 1726882603.77041: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15621 1726882603.77047: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882603.692182-16940-240657676657579/AnsiballZ_service_facts.py" <<< 15621 1726882603.77051: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmptycd0cx4 /root/.ansible/tmp/ansible-tmp-1726882603.692182-16940-240657676657579/AnsiballZ_service_facts.py <<< 15621 1726882603.77154: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmptycd0cx4" to remote "/root/.ansible/tmp/ansible-tmp-1726882603.692182-16940-240657676657579/AnsiballZ_service_facts.py" <<< 15621 1726882603.77160: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882603.692182-16940-240657676657579/AnsiballZ_service_facts.py" <<< 15621 1726882603.78733: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882603.78841: stderr chunk (state=3): >>><<< 15621 1726882603.78846: stdout chunk (state=3): >>><<< 15621 1726882603.78995: done transferring module to remote 15621 1726882603.78999: _low_level_execute_command(): starting 15621 1726882603.79001: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882603.692182-16940-240657676657579/ /root/.ansible/tmp/ansible-tmp-1726882603.692182-16940-240657676657579/AnsiballZ_service_facts.py && sleep 0' 15621 1726882603.80238: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882603.80281: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882603.80292: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882603.80311: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882603.80454: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882603.82452: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882603.82455: stdout chunk (state=3): >>><<< 15621 1726882603.82458: stderr chunk (state=3): >>><<< 15621 1726882603.82460: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882603.82463: _low_level_execute_command(): starting 15621 1726882603.82466: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882603.692182-16940-240657676657579/AnsiballZ_service_facts.py && sleep 0' 15621 1726882603.83487: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882603.83833: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882603.83848: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882603.83865: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882603.83981: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882605.98760: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": 
"dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"<<< 15621 1726882605.98813: stdout chunk (state=3): >>>name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": 
"stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": 
{"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.ser<<< 15621 1726882605.98833: stdout chunk (state=3): >>>vice": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-cache-update.service": 
{"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", 
"status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": 
"systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 15621 1726882606.00440: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 
<<< 15621 1726882606.00502: stderr chunk (state=3): >>><<< 15621 1726882606.00506: stdout chunk (state=3): >>><<< 15621 1726882606.00532: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": 
"plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": 
"systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": 
"plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 
15621 1726882606.01084: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882603.692182-16940-240657676657579/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15621 1726882606.01094: _low_level_execute_command(): starting 15621 1726882606.01099: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882603.692182-16940-240657676657579/ > /dev/null 2>&1 && sleep 0' 15621 1726882606.01597: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882606.01600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found <<< 15621 1726882606.01603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration <<< 15621 1726882606.01605: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882606.01608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882606.01660: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882606.01664: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882606.01671: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882606.01750: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882606.03727: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882606.03780: stderr chunk (state=3): >>><<< 15621 1726882606.03783: stdout chunk (state=3): >>><<< 15621 1726882606.03797: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882606.03803: handler run complete 15621 1726882606.03948: variable 'ansible_facts' from source: unknown 15621 1726882606.04075: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882606.04414: variable 'ansible_facts' from source: unknown 15621 1726882606.04513: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882606.04663: attempt loop complete, returning result 15621 1726882606.04666: _execute() done 15621 1726882606.04671: dumping result to json 15621 1726882606.04717: done dumping result, returning 15621 1726882606.04727: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0affc7ec-ae25-af1a-5b92-000000000371] 15621 1726882606.04733: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000371 ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15621 1726882606.05461: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000371 15621 1726882606.05465: WORKER PROCESS EXITING 15621 1726882606.05470: no more pending results, returning what we have 15621 1726882606.05472: results queue empty 15621 1726882606.05473: checking for any_errors_fatal 15621 1726882606.05476: done checking for any_errors_fatal 15621 1726882606.05477: checking for max_fail_percentage 15621 1726882606.05478: done checking for max_fail_percentage 15621 1726882606.05479: checking to see if all hosts have failed and the running result is not ok 15621 1726882606.05479: done checking to see if all hosts have failed 15621 1726882606.05480: getting the remaining hosts for this loop 15621 1726882606.05481: done getting the remaining hosts for this loop 15621 1726882606.05483: getting the next task for host managed_node3 15621 1726882606.05487: done getting next task for host managed_node3 15621 1726882606.05490: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 15621 1726882606.05492: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882606.05499: getting variables 15621 1726882606.05500: in VariableManager get_vars() 15621 1726882606.05526: Calling all_inventory to load vars for managed_node3 15621 1726882606.05528: Calling groups_inventory to load vars for managed_node3 15621 1726882606.05530: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882606.05537: Calling all_plugins_play to load vars for managed_node3 15621 1726882606.05539: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882606.05541: Calling groups_plugins_play to load vars for managed_node3 15621 1726882606.06526: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882606.07683: done with get_vars() 15621 1726882606.07703: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:36:46 -0400 (0:00:02.441) 0:00:38.157 ****** 15621 1726882606.07784: entering _queue_task() for managed_node3/package_facts 15621 1726882606.08063: worker is 1 (out of 1 available) 15621 1726882606.08078: exiting _queue_task() for managed_node3/package_facts 15621 1726882606.08091: done queuing things up, now waiting for results queue to drain 15621 1726882606.08093: waiting for pending results... 15621 1726882606.08288: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 15621 1726882606.08389: in run() - task 0affc7ec-ae25-af1a-5b92-000000000372 15621 1726882606.08402: variable 'ansible_search_path' from source: unknown 15621 1726882606.08406: variable 'ansible_search_path' from source: unknown 15621 1726882606.08444: calling self._execute() 15621 1726882606.08513: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882606.08517: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882606.08528: variable 'omit' from source: magic vars 15621 1726882606.08828: variable 'ansible_distribution_major_version' from source: facts 15621 1726882606.08837: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882606.08843: variable 'omit' from source: magic vars 15621 1726882606.08889: variable 'omit' from source: magic vars 15621 1726882606.08915: variable 'omit' from source: magic vars 15621 1726882606.08951: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882606.08983: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882606.09003: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882606.09018: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882606.09031: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882606.09057: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882606.09060: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882606.09063: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882606.09145: Set connection var ansible_connection to ssh 15621 
1726882606.09153: Set connection var ansible_shell_executable to /bin/sh 15621 1726882606.09159: Set connection var ansible_timeout to 10 15621 1726882606.09162: Set connection var ansible_shell_type to sh 15621 1726882606.09167: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882606.09175: Set connection var ansible_pipelining to False 15621 1726882606.09194: variable 'ansible_shell_executable' from source: unknown 15621 1726882606.09199: variable 'ansible_connection' from source: unknown 15621 1726882606.09202: variable 'ansible_module_compression' from source: unknown 15621 1726882606.09205: variable 'ansible_shell_type' from source: unknown 15621 1726882606.09208: variable 'ansible_shell_executable' from source: unknown 15621 1726882606.09210: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882606.09212: variable 'ansible_pipelining' from source: unknown 15621 1726882606.09215: variable 'ansible_timeout' from source: unknown 15621 1726882606.09220: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882606.09379: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15621 1726882606.09384: variable 'omit' from source: magic vars 15621 1726882606.09391: starting attempt loop 15621 1726882606.09394: running the handler 15621 1726882606.09410: _low_level_execute_command(): starting 15621 1726882606.09413: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15621 1726882606.09953: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882606.09957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882606.09962: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882606.09964: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882606.10021: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882606.10030: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882606.10032: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882606.10114: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882606.11857: stdout chunk (state=3): >>>/root <<< 15621 1726882606.11957: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882606.12027: stderr chunk (state=3): >>><<< 15621 1726882606.12031: stdout chunk (state=3): >>><<< 15621 
1726882606.12041: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882606.12054: _low_level_execute_command(): starting 15621 1726882606.12061: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882606.1204152-16982-219916775588964 `" && echo ansible-tmp-1726882606.1204152-16982-219916775588964="` echo /root/.ansible/tmp/ansible-tmp-1726882606.1204152-16982-219916775588964 `" ) && sleep 0' 15621 1726882606.12536: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882606.12539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882606.12552: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882606.12555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found <<< 15621 1726882606.12557: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882606.12600: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882606.12606: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882606.12608: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882606.12693: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882606.14684: stdout chunk (state=3): >>>ansible-tmp-1726882606.1204152-16982-219916775588964=/root/.ansible/tmp/ansible-tmp-1726882606.1204152-16982-219916775588964 <<< 15621 1726882606.14803: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 15621 1726882606.14858: stderr chunk (state=3): >>><<< 15621 1726882606.14861: stdout chunk (state=3): >>><<< 15621 1726882606.14877: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882606.1204152-16982-219916775588964=/root/.ansible/tmp/ansible-tmp-1726882606.1204152-16982-219916775588964 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882606.14916: variable 'ansible_module_compression' from source: unknown 15621 1726882606.14958: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15621x2kdpbzd/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 15621 1726882606.15021: variable 'ansible_facts' from source: unknown 15621 1726882606.15129: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882606.1204152-16982-219916775588964/AnsiballZ_package_facts.py 15621 1726882606.15247: Sending initial data 15621 1726882606.15250: Sent initial data (162 bytes) 15621 1726882606.15728: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882606.15731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882606.15734: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration <<< 15621 1726882606.15736: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882606.15785: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882606.15789: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882606.15878: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882606.17495: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15621 1726882606.17575: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15621 1726882606.17662: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmpyog8cilf /root/.ansible/tmp/ansible-tmp-1726882606.1204152-16982-219916775588964/AnsiballZ_package_facts.py <<< 15621 1726882606.17666: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882606.1204152-16982-219916775588964/AnsiballZ_package_facts.py" <<< 15621 1726882606.17742: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmpyog8cilf" to remote "/root/.ansible/tmp/ansible-tmp-1726882606.1204152-16982-219916775588964/AnsiballZ_package_facts.py" <<< 15621 1726882606.17748: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882606.1204152-16982-219916775588964/AnsiballZ_package_facts.py" <<< 15621 1726882606.19094: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882606.19164: stderr chunk (state=3): >>><<< 15621 1726882606.19167: stdout chunk (state=3): >>><<< 15621 1726882606.19190: done transferring module to remote 15621 1726882606.19204: _low_level_execute_command(): starting 15621 1726882606.19227: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882606.1204152-16982-219916775588964/ /root/.ansible/tmp/ansible-tmp-1726882606.1204152-16982-219916775588964/AnsiballZ_package_facts.py && sleep 0' 15621 1726882606.19696: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882606.19699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882606.19701: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration <<< 15621 1726882606.19708: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found <<< 15621 1726882606.19710: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882606.19754: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882606.19758: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882606.19847: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882606.21707: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882606.21755: stderr chunk (state=3): >>><<< 15621 1726882606.21758: stdout chunk (state=3): >>><<< 15621 1726882606.21771: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882606.21777: _low_level_execute_command(): starting 15621 1726882606.21783: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882606.1204152-16982-219916775588964/AnsiballZ_package_facts.py && sleep 0' 15621 1726882606.22257: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882606.22260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found <<< 15621 1726882606.22263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882606.22265: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882606.22269: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882606.22318: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 
1726882606.22327: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882606.22416: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882606.85157: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": 
"amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", 
"version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "<<< 15621 1726882606.85285: stdout chunk (state=3): >>>systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": 
"hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": 
"7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": 
"libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", 
"release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "sourc<<< 15621 1726882606.85317: stdout chunk (state=3): >>>e": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": 
"rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": 
"systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": 
"2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch":<<< 15621 1726882606.85352: stdout chunk (state=3): >>> null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", 
"release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 15621 1726882606.87203: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882606.87318: stderr chunk (state=3): >>>Shared connection to 10.31.45.226 closed. 
<<< 15621 1726882606.87369: stderr chunk (state=3): >>><<< 15621 1726882606.87375: stdout chunk (state=3): >>><<< 15621 1726882606.87439: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": 
"libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", 
"release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 
1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", 
"release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": 
[{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": 
"3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", 
"version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", 
"version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": 
"1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "1.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": 
"wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": 
"python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 15621 1726882606.89918: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882606.1204152-16982-219916775588964/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15621 1726882606.89927: _low_level_execute_command(): starting 15621 1726882606.89933: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882606.1204152-16982-219916775588964/ > /dev/null 2>&1 && sleep 0' 15621 1726882606.90537: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882606.90586: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882606.90592: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882606.90677: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882606.92645: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882606.92697: stderr chunk (state=3): >>><<< 15621 1726882606.92699: stdout chunk (state=3): >>><<< 15621 1726882606.92708: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882606.92728: handler run complete 15621 1726882606.93441: variable 'ansible_facts' from source: unknown 15621 1726882606.93893: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882606.96642: variable 'ansible_facts' from source: unknown 15621 1726882606.97145: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882606.97963: attempt loop complete, returning result 15621 1726882606.97977: _execute() done 15621 1726882606.97982: dumping result to json 15621 1726882606.98173: done dumping result, returning 15621 1726882606.98183: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affc7ec-ae25-af1a-5b92-000000000372] 15621 1726882606.98189: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000372 15621 1726882607.00105: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000372 15621 1726882607.00109: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15621 1726882607.00230: no more pending results, returning what we have 15621 1726882607.00233: results queue empty 15621 1726882607.00233: checking for any_errors_fatal 15621 1726882607.00238: done checking for any_errors_fatal 15621 1726882607.00242: checking for max_fail_percentage 15621 1726882607.00244: done checking for max_fail_percentage 15621 1726882607.00245: checking to see if all hosts have failed and the running result is not ok 15621 1726882607.00246: done checking to see if all hosts have failed 15621 1726882607.00246: getting the remaining hosts for this loop 15621 1726882607.00250: done getting the remaining hosts for this loop 15621 1726882607.00254: getting the next task for host managed_node3 15621 1726882607.00261: done getting next task for host managed_node3 15621 1726882607.00265: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 15621 1726882607.00267: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882607.00275: getting variables 15621 1726882607.00277: in VariableManager get_vars() 15621 1726882607.00304: Calling all_inventory to load vars for managed_node3 15621 1726882607.00306: Calling groups_inventory to load vars for managed_node3 15621 1726882607.00308: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882607.00321: Calling all_plugins_play to load vars for managed_node3 15621 1726882607.00328: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882607.00333: Calling groups_plugins_play to load vars for managed_node3 15621 1726882607.01418: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882607.02858: done with get_vars() 15621 1726882607.02878: done getting variables 15621 1726882607.02950: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:36:47 -0400 (0:00:00.951) 0:00:39.109 ****** 15621 1726882607.02986: entering _queue_task() for managed_node3/debug 15621 1726882607.03288: worker is 1 (out of 1 available) 15621 1726882607.03302: exiting _queue_task() for managed_node3/debug 15621 1726882607.03313: done queuing things up, now waiting for results queue to drain 15621 1726882607.03316: waiting for pending results... 15621 1726882607.03519: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 15621 1726882607.03617: in run() - task 0affc7ec-ae25-af1a-5b92-00000000003d 15621 1726882607.03638: variable 'ansible_search_path' from source: unknown 15621 1726882607.03644: variable 'ansible_search_path' from source: unknown 15621 1726882607.03694: calling self._execute() 15621 1726882607.03787: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882607.03794: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882607.03802: variable 'omit' from source: magic vars 15621 1726882607.04090: variable 'ansible_distribution_major_version' from source: facts 15621 1726882607.04099: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882607.04103: variable 'omit' from source: magic vars 15621 1726882607.04139: variable 'omit' from source: magic vars 15621 1726882607.04208: variable 'network_provider' from source: set_fact 15621 1726882607.04227: variable 'omit' from source: magic vars 15621 1726882607.04260: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882607.04290: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882607.04309: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882607.04324: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882607.04338: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 
1726882607.04381: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882607.04384: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882607.04387: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882607.04461: Set connection var ansible_connection to ssh 15621 1726882607.04469: Set connection var ansible_shell_executable to /bin/sh 15621 1726882607.04476: Set connection var ansible_timeout to 10 15621 1726882607.04479: Set connection var ansible_shell_type to sh 15621 1726882607.04482: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882607.04488: Set connection var ansible_pipelining to False 15621 1726882607.04507: variable 'ansible_shell_executable' from source: unknown 15621 1726882607.04511: variable 'ansible_connection' from source: unknown 15621 1726882607.04513: variable 'ansible_module_compression' from source: unknown 15621 1726882607.04516: variable 'ansible_shell_type' from source: unknown 15621 1726882607.04520: variable 'ansible_shell_executable' from source: unknown 15621 1726882607.04523: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882607.04525: variable 'ansible_pipelining' from source: unknown 15621 1726882607.04528: variable 'ansible_timeout' from source: unknown 15621 1726882607.04534: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882607.04642: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15621 1726882607.04652: variable 'omit' from source: magic vars 15621 1726882607.04655: starting attempt loop 15621 1726882607.04658: running the handler 15621 1726882607.04698: handler run complete 15621 1726882607.04709: attempt loop complete, returning result 15621 1726882607.04712: _execute() done 15621 1726882607.04715: dumping result to json 15621 1726882607.04718: done dumping result, returning 15621 1726882607.04727: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0affc7ec-ae25-af1a-5b92-00000000003d] 15621 1726882607.04733: sending task result for task 0affc7ec-ae25-af1a-5b92-00000000003d 15621 1726882607.04818: done sending task result for task 0affc7ec-ae25-af1a-5b92-00000000003d 15621 1726882607.04823: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 15621 1726882607.04884: no more pending results, returning what we have 15621 1726882607.04887: results queue empty 15621 1726882607.04888: checking for any_errors_fatal 15621 1726882607.04895: done checking for any_errors_fatal 15621 1726882607.04896: checking for max_fail_percentage 15621 1726882607.04897: done checking for max_fail_percentage 15621 1726882607.04898: checking to see if all hosts have failed and the running result is not ok 15621 1726882607.04899: done checking to see if all hosts have failed 15621 1726882607.04900: getting the remaining hosts for this loop 15621 1726882607.04901: done getting the remaining hosts for this loop 15621 1726882607.04905: getting the next task for host managed_node3 15621 1726882607.04910: done getting next task for host managed_node3 15621 1726882607.04914: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state 
configuration if using the `network_state` variable with the initscripts provider 15621 1726882607.04915: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15621 1726882607.04931: getting variables 15621 1726882607.04933: in VariableManager get_vars() 15621 1726882607.04968: Calling all_inventory to load vars for managed_node3 15621 1726882607.04970: Calling groups_inventory to load vars for managed_node3 15621 1726882607.04975: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882607.04984: Calling all_plugins_play to load vars for managed_node3 15621 1726882607.04987: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882607.04989: Calling groups_plugins_play to load vars for managed_node3 15621 1726882607.06180: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882607.07564: done with get_vars() 15621 1726882607.07582: done getting variables 15621 1726882607.07629: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:36:47 -0400 (0:00:00.046) 0:00:39.155 ****** 15621 1726882607.07652: entering _queue_task() for managed_node3/fail 15621 1726882607.07866: worker is 1 (out of 1 available) 15621 1726882607.07880: exiting _queue_task() for managed_node3/fail 15621 1726882607.07892: done queuing things up, now waiting for results queue to drain 15621 1726882607.07894: waiting for pending results... 
15621 1726882607.08201: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 15621 1726882607.08269: in run() - task 0affc7ec-ae25-af1a-5b92-00000000003e 15621 1726882607.08274: variable 'ansible_search_path' from source: unknown 15621 1726882607.08278: variable 'ansible_search_path' from source: unknown 15621 1726882607.08307: calling self._execute() 15621 1726882607.08391: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882607.08395: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882607.08413: variable 'omit' from source: magic vars 15621 1726882607.08826: variable 'ansible_distribution_major_version' from source: facts 15621 1726882607.08841: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882607.08977: variable 'network_state' from source: role '' defaults 15621 1726882607.08981: Evaluated conditional (network_state != {}): False 15621 1726882607.08984: when evaluation is False, skipping this task 15621 1726882607.08987: _execute() done 15621 1726882607.08992: dumping result to json 15621 1726882607.08997: done dumping result, returning 15621 1726882607.09006: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affc7ec-ae25-af1a-5b92-00000000003e] 15621 1726882607.09018: sending task result for task 0affc7ec-ae25-af1a-5b92-00000000003e 15621 1726882607.09126: done sending task result for task 0affc7ec-ae25-af1a-5b92-00000000003e 15621 1726882607.09130: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15621 1726882607.09198: no more pending results, returning what we have 15621 1726882607.09201: results queue empty 15621 1726882607.09201: checking for any_errors_fatal 15621 1726882607.09206: done checking for any_errors_fatal 15621 1726882607.09206: checking for max_fail_percentage 15621 1726882607.09208: done checking for max_fail_percentage 15621 1726882607.09209: checking to see if all hosts have failed and the running result is not ok 15621 1726882607.09210: done checking to see if all hosts have failed 15621 1726882607.09211: getting the remaining hosts for this loop 15621 1726882607.09212: done getting the remaining hosts for this loop 15621 1726882607.09215: getting the next task for host managed_node3 15621 1726882607.09219: done getting next task for host managed_node3 15621 1726882607.09224: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 15621 1726882607.09227: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882607.09244: getting variables 15621 1726882607.09246: in VariableManager get_vars() 15621 1726882607.09278: Calling all_inventory to load vars for managed_node3 15621 1726882607.09280: Calling groups_inventory to load vars for managed_node3 15621 1726882607.09283: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882607.09294: Calling all_plugins_play to load vars for managed_node3 15621 1726882607.09298: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882607.09301: Calling groups_plugins_play to load vars for managed_node3 15621 1726882607.10390: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882607.11621: done with get_vars() 15621 1726882607.11641: done getting variables 15621 1726882607.11686: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:36:47 -0400 (0:00:00.040) 0:00:39.196 ****** 15621 1726882607.11711: entering _queue_task() for managed_node3/fail 15621 1726882607.11932: worker is 1 (out of 1 available) 15621 1726882607.11947: exiting _queue_task() for managed_node3/fail 15621 1726882607.11958: done queuing things up, now waiting for results queue to drain 15621 1726882607.11960: waiting for pending results... 
15621 1726882607.12158: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 15621 1726882607.12230: in run() - task 0affc7ec-ae25-af1a-5b92-00000000003f 15621 1726882607.12241: variable 'ansible_search_path' from source: unknown 15621 1726882607.12245: variable 'ansible_search_path' from source: unknown 15621 1726882607.12274: calling self._execute() 15621 1726882607.12348: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882607.12354: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882607.12363: variable 'omit' from source: magic vars 15621 1726882607.12653: variable 'ansible_distribution_major_version' from source: facts 15621 1726882607.12663: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882607.12751: variable 'network_state' from source: role '' defaults 15621 1726882607.12761: Evaluated conditional (network_state != {}): False 15621 1726882607.12764: when evaluation is False, skipping this task 15621 1726882607.12767: _execute() done 15621 1726882607.12770: dumping result to json 15621 1726882607.12778: done dumping result, returning 15621 1726882607.12785: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affc7ec-ae25-af1a-5b92-00000000003f] 15621 1726882607.12791: sending task result for task 0affc7ec-ae25-af1a-5b92-00000000003f 15621 1726882607.12882: done sending task result for task 0affc7ec-ae25-af1a-5b92-00000000003f 15621 1726882607.12886: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15621 1726882607.12936: no more pending results, returning what we have 15621 1726882607.12940: results queue empty 15621 1726882607.12940: checking for any_errors_fatal 15621 1726882607.12945: done checking for any_errors_fatal 15621 1726882607.12946: checking for max_fail_percentage 15621 1726882607.12947: done checking for max_fail_percentage 15621 1726882607.12948: checking to see if all hosts have failed and the running result is not ok 15621 1726882607.12949: done checking to see if all hosts have failed 15621 1726882607.12950: getting the remaining hosts for this loop 15621 1726882607.12951: done getting the remaining hosts for this loop 15621 1726882607.12954: getting the next task for host managed_node3 15621 1726882607.12959: done getting next task for host managed_node3 15621 1726882607.12963: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 15621 1726882607.12965: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882607.12981: getting variables 15621 1726882607.12982: in VariableManager get_vars() 15621 1726882607.13013: Calling all_inventory to load vars for managed_node3 15621 1726882607.13015: Calling groups_inventory to load vars for managed_node3 15621 1726882607.13018: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882607.13028: Calling all_plugins_play to load vars for managed_node3 15621 1726882607.13031: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882607.13034: Calling groups_plugins_play to load vars for managed_node3 15621 1726882607.17716: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882607.19769: done with get_vars() 15621 1726882607.19793: done getting variables 15621 1726882607.19847: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:36:47 -0400 (0:00:00.081) 0:00:39.277 ****** 15621 1726882607.19873: entering _queue_task() for managed_node3/fail 15621 1726882607.20210: worker is 1 (out of 1 available) 15621 1726882607.20226: exiting _queue_task() for managed_node3/fail 15621 1726882607.20238: done queuing things up, now waiting for results queue to drain 15621 1726882607.20239: waiting for pending results... 
15621 1726882607.20644: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 15621 1726882607.20654: in run() - task 0affc7ec-ae25-af1a-5b92-000000000040 15621 1726882607.20673: variable 'ansible_search_path' from source: unknown 15621 1726882607.20680: variable 'ansible_search_path' from source: unknown 15621 1726882607.20726: calling self._execute() 15621 1726882607.20834: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882607.20956: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882607.20960: variable 'omit' from source: magic vars 15621 1726882607.21273: variable 'ansible_distribution_major_version' from source: facts 15621 1726882607.21293: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882607.21476: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15621 1726882607.23917: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15621 1726882607.24005: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15621 1726882607.24232: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15621 1726882607.24235: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15621 1726882607.24239: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15621 1726882607.24242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882607.24256: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882607.24289: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882607.24340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882607.24368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882607.24488: variable 'ansible_distribution_major_version' from source: facts 15621 1726882607.24511: Evaluated conditional (ansible_distribution_major_version | int > 9): True 15621 1726882607.24649: variable 'ansible_distribution' from source: facts 15621 1726882607.24660: variable '__network_rh_distros' from source: role '' defaults 15621 1726882607.24673: Evaluated conditional (ansible_distribution in __network_rh_distros): False 15621 1726882607.24687: when evaluation is False, skipping this task 15621 1726882607.24696: _execute() done 15621 1726882607.24704: dumping result to json 15621 1726882607.24713: done dumping result, returning 15621 1726882607.24728: done running TaskExecutor() 
for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affc7ec-ae25-af1a-5b92-000000000040] 15621 1726882607.24740: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000040 15621 1726882607.25031: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000040 15621 1726882607.25035: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 15621 1726882607.25085: no more pending results, returning what we have 15621 1726882607.25089: results queue empty 15621 1726882607.25090: checking for any_errors_fatal 15621 1726882607.25100: done checking for any_errors_fatal 15621 1726882607.25101: checking for max_fail_percentage 15621 1726882607.25103: done checking for max_fail_percentage 15621 1726882607.25104: checking to see if all hosts have failed and the running result is not ok 15621 1726882607.25105: done checking to see if all hosts have failed 15621 1726882607.25106: getting the remaining hosts for this loop 15621 1726882607.25107: done getting the remaining hosts for this loop 15621 1726882607.25111: getting the next task for host managed_node3 15621 1726882607.25117: done getting next task for host managed_node3 15621 1726882607.25123: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 15621 1726882607.25125: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882607.25140: getting variables 15621 1726882607.25141: in VariableManager get_vars() 15621 1726882607.25180: Calling all_inventory to load vars for managed_node3 15621 1726882607.25183: Calling groups_inventory to load vars for managed_node3 15621 1726882607.25185: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882607.25195: Calling all_plugins_play to load vars for managed_node3 15621 1726882607.25197: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882607.25200: Calling groups_plugins_play to load vars for managed_node3 15621 1726882607.27098: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882607.29201: done with get_vars() 15621 1726882607.29231: done getting variables 15621 1726882607.29298: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:36:47 -0400 (0:00:00.094) 0:00:39.372 ****** 15621 1726882607.29334: entering _queue_task() for managed_node3/dnf 15621 1726882607.29663: worker is 1 (out of 1 available) 15621 1726882607.29677: exiting _queue_task() for managed_node3/dnf 15621 1726882607.29689: done queuing things up, now waiting for results queue to drain 15621 1726882607.29691: waiting for pending results... 
15621 1726882607.29980: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 15621 1726882607.30109: in run() - task 0affc7ec-ae25-af1a-5b92-000000000041 15621 1726882607.30136: variable 'ansible_search_path' from source: unknown 15621 1726882607.30149: variable 'ansible_search_path' from source: unknown 15621 1726882607.30197: calling self._execute() 15621 1726882607.30301: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882607.30314: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882607.30332: variable 'omit' from source: magic vars 15621 1726882607.30748: variable 'ansible_distribution_major_version' from source: facts 15621 1726882607.30766: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882607.30989: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15621 1726882607.33511: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15621 1726882607.33514: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15621 1726882607.33524: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15621 1726882607.33567: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15621 1726882607.33597: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15621 1726882607.33690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882607.33727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882607.33763: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882607.33812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882607.33836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882607.33966: variable 'ansible_distribution' from source: facts 15621 1726882607.33976: variable 'ansible_distribution_major_version' from source: facts 15621 1726882607.33987: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 15621 1726882607.34112: variable '__network_wireless_connections_defined' from source: role '' defaults 15621 1726882607.34260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882607.34387: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882607.34391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882607.34393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882607.34396: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882607.34434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882607.34463: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882607.34497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882607.34544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882607.34562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882607.34614: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882607.34645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882607.34675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882607.34725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882607.34744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882607.34911: variable 'network_connections' from source: play vars 15621 1726882607.34931: variable 'profile' from source: play vars 15621 1726882607.35005: variable 'profile' from source: play vars 15621 1726882607.35015: variable 'interface' from source: set_fact 15621 1726882607.35086: variable 'interface' from source: set_fact 15621 1726882607.35169: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' 
skipped due to reserved name 15621 1726882607.35365: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15621 1726882607.35474: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15621 1726882607.35477: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15621 1726882607.35480: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15621 1726882607.35535: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15621 1726882607.35567: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15621 1726882607.35611: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882607.35691: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15621 1726882607.35700: variable '__network_team_connections_defined' from source: role '' defaults 15621 1726882607.35964: variable 'network_connections' from source: play vars 15621 1726882607.35975: variable 'profile' from source: play vars 15621 1726882607.36050: variable 'profile' from source: play vars 15621 1726882607.36061: variable 'interface' from source: set_fact 15621 1726882607.36136: variable 'interface' from source: set_fact 15621 1726882607.36163: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15621 1726882607.36329: when evaluation is False, skipping this task 15621 1726882607.36332: _execute() done 15621 1726882607.36334: dumping result to json 15621 1726882607.36336: done dumping result, returning 15621 1726882607.36338: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affc7ec-ae25-af1a-5b92-000000000041] 15621 1726882607.36340: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000041 15621 1726882607.36405: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000041 15621 1726882607.36408: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15621 1726882607.36465: no more pending results, returning what we have 15621 1726882607.36469: results queue empty 15621 1726882607.36470: checking for any_errors_fatal 15621 1726882607.36477: done checking for any_errors_fatal 15621 1726882607.36478: checking for max_fail_percentage 15621 1726882607.36480: done checking for max_fail_percentage 15621 1726882607.36481: checking to see if all hosts have failed and the running result is not ok 15621 1726882607.36482: done checking to see if all hosts have failed 15621 1726882607.36483: getting the remaining hosts for this loop 15621 1726882607.36484: done getting the remaining hosts for this loop 15621 
1726882607.36489: getting the next task for host managed_node3 15621 1726882607.36496: done getting next task for host managed_node3 15621 1726882607.36500: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 15621 1726882607.36502: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15621 1726882607.36520: getting variables 15621 1726882607.36525: in VariableManager get_vars() 15621 1726882607.36571: Calling all_inventory to load vars for managed_node3 15621 1726882607.36574: Calling groups_inventory to load vars for managed_node3 15621 1726882607.36576: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882607.36591: Calling all_plugins_play to load vars for managed_node3 15621 1726882607.36594: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882607.36597: Calling groups_plugins_play to load vars for managed_node3 15621 1726882607.38488: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882607.40562: done with get_vars() 15621 1726882607.40587: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 15621 1726882607.40668: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:36:47 -0400 (0:00:00.113) 0:00:39.486 ****** 15621 1726882607.40698: entering _queue_task() for managed_node3/yum 15621 1726882607.41021: worker is 1 (out of 1 available) 15621 1726882607.41037: exiting _queue_task() for managed_node3/yum 15621 1726882607.41050: done queuing things up, now waiting for results queue to drain 15621 1726882607.41051: waiting for pending results... 
15621 1726882607.41352: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 15621 1726882607.41624: in run() - task 0affc7ec-ae25-af1a-5b92-000000000042 15621 1726882607.41632: variable 'ansible_search_path' from source: unknown 15621 1726882607.41636: variable 'ansible_search_path' from source: unknown 15621 1726882607.41639: calling self._execute() 15621 1726882607.41641: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882607.41644: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882607.41646: variable 'omit' from source: magic vars 15621 1726882607.42029: variable 'ansible_distribution_major_version' from source: facts 15621 1726882607.42047: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882607.42250: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15621 1726882607.44998: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15621 1726882607.45084: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15621 1726882607.45128: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15621 1726882607.45172: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15621 1726882607.45203: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15621 1726882607.45296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882607.45336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882607.45373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882607.45421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882607.45445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882607.45547: variable 'ansible_distribution_major_version' from source: facts 15621 1726882607.45572: Evaluated conditional (ansible_distribution_major_version | int < 8): False 15621 1726882607.45580: when evaluation is False, skipping this task 15621 1726882607.45588: _execute() done 15621 1726882607.45597: dumping result to json 15621 1726882607.45605: done dumping result, returning 15621 1726882607.45616: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affc7ec-ae25-af1a-5b92-000000000042] 15621 
1726882607.45628: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000042 15621 1726882607.45886: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000042 15621 1726882607.45890: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 15621 1726882607.45940: no more pending results, returning what we have 15621 1726882607.45943: results queue empty 15621 1726882607.45944: checking for any_errors_fatal 15621 1726882607.45951: done checking for any_errors_fatal 15621 1726882607.45952: checking for max_fail_percentage 15621 1726882607.45953: done checking for max_fail_percentage 15621 1726882607.45954: checking to see if all hosts have failed and the running result is not ok 15621 1726882607.45955: done checking to see if all hosts have failed 15621 1726882607.45956: getting the remaining hosts for this loop 15621 1726882607.45957: done getting the remaining hosts for this loop 15621 1726882607.45962: getting the next task for host managed_node3 15621 1726882607.45967: done getting next task for host managed_node3 15621 1726882607.45971: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 15621 1726882607.45973: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15621 1726882607.45987: getting variables 15621 1726882607.45989: in VariableManager get_vars() 15621 1726882607.46028: Calling all_inventory to load vars for managed_node3 15621 1726882607.46031: Calling groups_inventory to load vars for managed_node3 15621 1726882607.46034: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882607.46045: Calling all_plugins_play to load vars for managed_node3 15621 1726882607.46048: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882607.46051: Calling groups_plugins_play to load vars for managed_node3 15621 1726882607.47918: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882607.50335: done with get_vars() 15621 1726882607.50376: done getting variables 15621 1726882607.50444: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:36:47 -0400 (0:00:00.097) 0:00:39.584 ****** 15621 1726882607.50485: entering _queue_task() for managed_node3/fail 15621 1726882607.50866: worker is 1 (out of 1 available) 15621 1726882607.50883: exiting _queue_task() for managed_node3/fail 15621 1726882607.50895: done queuing things up, now waiting for results queue to drain 15621 1726882607.50897: waiting for pending results... 
15621 1726882607.51212: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 15621 1726882607.51336: in run() - task 0affc7ec-ae25-af1a-5b92-000000000043 15621 1726882607.51365: variable 'ansible_search_path' from source: unknown 15621 1726882607.51375: variable 'ansible_search_path' from source: unknown 15621 1726882607.51419: calling self._execute() 15621 1726882607.51528: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882607.51564: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882607.51568: variable 'omit' from source: magic vars 15621 1726882607.51991: variable 'ansible_distribution_major_version' from source: facts 15621 1726882607.52014: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882607.52221: variable '__network_wireless_connections_defined' from source: role '' defaults 15621 1726882607.52404: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15621 1726882607.54956: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15621 1726882607.55043: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15621 1726882607.55096: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15621 1726882607.55140: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15621 1726882607.55184: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15621 1726882607.55288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882607.55327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882607.55390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882607.55416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882607.55437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882607.55488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882607.55528: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882607.55606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882607.55611: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882607.55639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882607.55696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882607.55755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882607.55767: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882607.55810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882607.55863: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882607.56051: variable 'network_connections' from source: play vars 15621 1726882607.56068: variable 'profile' from source: play vars 15621 1726882607.56161: variable 'profile' from source: play vars 15621 1726882607.56176: variable 'interface' from source: set_fact 15621 1726882607.56299: variable 'interface' from source: set_fact 15621 1726882607.56363: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15621 1726882607.56742: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15621 1726882607.56790: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15621 1726882607.56914: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15621 1726882607.56917: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15621 1726882607.56930: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15621 1726882607.56963: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15621 1726882607.56999: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882607.57036: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15621 1726882607.57097: 
variable '__network_team_connections_defined' from source: role '' defaults 15621 1726882607.57397: variable 'network_connections' from source: play vars 15621 1726882607.57408: variable 'profile' from source: play vars 15621 1726882607.57487: variable 'profile' from source: play vars 15621 1726882607.57498: variable 'interface' from source: set_fact 15621 1726882607.57566: variable 'interface' from source: set_fact 15621 1726882607.57728: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15621 1726882607.57731: when evaluation is False, skipping this task 15621 1726882607.57733: _execute() done 15621 1726882607.57735: dumping result to json 15621 1726882607.57737: done dumping result, returning 15621 1726882607.57740: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affc7ec-ae25-af1a-5b92-000000000043] 15621 1726882607.57749: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000043 15621 1726882607.57825: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000043 15621 1726882607.57829: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15621 1726882607.57887: no more pending results, returning what we have 15621 1726882607.57891: results queue empty 15621 1726882607.57892: checking for any_errors_fatal 15621 1726882607.57899: done checking for any_errors_fatal 15621 1726882607.57900: checking for max_fail_percentage 15621 1726882607.57901: done checking for max_fail_percentage 15621 1726882607.57902: checking to see if all hosts have failed and the running result is not ok 15621 1726882607.57903: done checking to see if all hosts have failed 15621 1726882607.57904: getting the remaining hosts for this loop 15621 1726882607.57906: done getting the remaining hosts for this loop 15621 1726882607.57911: getting the next task for host managed_node3 15621 1726882607.57917: done getting next task for host managed_node3 15621 1726882607.57921: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 15621 1726882607.57924: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882607.57941: getting variables 15621 1726882607.57943: in VariableManager get_vars() 15621 1726882607.57986: Calling all_inventory to load vars for managed_node3 15621 1726882607.57989: Calling groups_inventory to load vars for managed_node3 15621 1726882607.57992: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882607.58005: Calling all_plugins_play to load vars for managed_node3 15621 1726882607.58008: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882607.58011: Calling groups_plugins_play to load vars for managed_node3 15621 1726882607.60148: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882607.62336: done with get_vars() 15621 1726882607.62368: done getting variables 15621 1726882607.62437: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:36:47 -0400 (0:00:00.119) 0:00:39.703 ****** 15621 1726882607.62478: entering _queue_task() for managed_node3/package 15621 1726882607.62842: worker is 1 (out of 1 available) 15621 1726882607.62854: exiting _queue_task() for managed_node3/package 15621 1726882607.62866: done queuing things up, now waiting for results queue to drain 15621 1726882607.62868: waiting for pending results... 15621 1726882607.63340: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 15621 1726882607.63350: in run() - task 0affc7ec-ae25-af1a-5b92-000000000044 15621 1726882607.63354: variable 'ansible_search_path' from source: unknown 15621 1726882607.63356: variable 'ansible_search_path' from source: unknown 15621 1726882607.63382: calling self._execute() 15621 1726882607.63488: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882607.63569: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882607.63577: variable 'omit' from source: magic vars 15621 1726882607.63946: variable 'ansible_distribution_major_version' from source: facts 15621 1726882607.63962: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882607.64208: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15621 1726882607.64528: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15621 1726882607.64582: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15621 1726882607.64658: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15621 1726882607.64717: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15621 1726882607.64855: variable 'network_packages' from source: role '' defaults 15621 1726882607.64998: variable '__network_provider_setup' from source: role '' defaults 15621 1726882607.65093: variable '__network_service_name_default_nm' from source: role '' defaults 15621 1726882607.65097: variable 
'__network_service_name_default_nm' from source: role '' defaults 15621 1726882607.65115: variable '__network_packages_default_nm' from source: role '' defaults 15621 1726882607.65188: variable '__network_packages_default_nm' from source: role '' defaults 15621 1726882607.65417: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15621 1726882607.67735: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15621 1726882607.67805: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15621 1726882607.67857: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15621 1726882607.67900: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15621 1726882607.67948: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15621 1726882607.68128: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882607.68132: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882607.68135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882607.68167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882607.68190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882607.68244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882607.68286: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882607.68320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882607.68375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882607.68401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882607.68643: variable '__network_packages_default_gobject_packages' from source: role '' defaults 15621 1726882607.68778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882607.68810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882607.68854: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882607.68932: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882607.68935: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882607.69036: variable 'ansible_python' from source: facts 15621 1726882607.69078: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 15621 1726882607.69229: variable '__network_wpa_supplicant_required' from source: role '' defaults 15621 1726882607.69280: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15621 1726882607.69428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882607.69465: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882607.69502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882607.69559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882607.69584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882607.69643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882607.69735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882607.69738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882607.69777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882607.69798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882607.69977: variable 'network_connections' from source: play vars 15621 1726882607.70029: variable 'profile' from source: play vars 15621 1726882607.70118: variable 'profile' from source: play vars 15621 1726882607.70134: variable 'interface' from source: set_fact 15621 1726882607.70221: variable 'interface' from source: set_fact 15621 1726882607.70314: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15621 1726882607.70399: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15621 1726882607.70402: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882607.70439: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15621 1726882607.70494: variable '__network_wireless_connections_defined' from source: role '' defaults 15621 1726882607.70859: variable 'network_connections' from source: play vars 15621 1726882607.70868: variable 'profile' from source: play vars 15621 1726882607.70977: variable 'profile' from source: play vars 15621 1726882607.70989: variable 'interface' from source: set_fact 15621 1726882607.71070: variable 'interface' from source: set_fact 15621 1726882607.71128: variable '__network_packages_default_wireless' from source: role '' defaults 15621 1726882607.71214: variable '__network_wireless_connections_defined' from source: role '' defaults 15621 1726882607.71590: variable 'network_connections' from source: play vars 15621 1726882607.71594: variable 'profile' from source: play vars 15621 1726882607.71699: variable 'profile' from source: play vars 15621 1726882607.71702: variable 'interface' from source: set_fact 15621 1726882607.71800: variable 'interface' from source: set_fact 15621 1726882607.72028: variable '__network_packages_default_team' from source: role '' defaults 15621 1726882607.72031: variable '__network_team_connections_defined' from source: role '' defaults 15621 1726882607.72291: variable 'network_connections' from source: play vars 15621 1726882607.72301: variable 'profile' from source: play vars 15621 1726882607.72383: variable 'profile' from source: play vars 15621 1726882607.72393: variable 'interface' from source: set_fact 15621 1726882607.72509: variable 'interface' from source: set_fact 15621 1726882607.72575: variable '__network_service_name_default_initscripts' from source: role '' defaults 15621 1726882607.72655: variable '__network_service_name_default_initscripts' from source: role '' defaults 15621 1726882607.72666: variable '__network_packages_default_initscripts' from source: role '' defaults 15621 1726882607.72744: variable '__network_packages_default_initscripts' from source: role '' defaults 15621 1726882607.73007: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 15621 1726882607.73593: variable 'network_connections' from source: play vars 15621 1726882607.73604: variable 'profile' from source: play vars 15621 
1726882607.73684: variable 'profile' from source: play vars 15621 1726882607.73693: variable 'interface' from source: set_fact 15621 1726882607.73766: variable 'interface' from source: set_fact 15621 1726882607.73793: variable 'ansible_distribution' from source: facts 15621 1726882607.73802: variable '__network_rh_distros' from source: role '' defaults 15621 1726882607.73812: variable 'ansible_distribution_major_version' from source: facts 15621 1726882607.73891: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 15621 1726882607.74039: variable 'ansible_distribution' from source: facts 15621 1726882607.74050: variable '__network_rh_distros' from source: role '' defaults 15621 1726882607.74061: variable 'ansible_distribution_major_version' from source: facts 15621 1726882607.74075: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 15621 1726882607.74276: variable 'ansible_distribution' from source: facts 15621 1726882607.74286: variable '__network_rh_distros' from source: role '' defaults 15621 1726882607.74297: variable 'ansible_distribution_major_version' from source: facts 15621 1726882607.74346: variable 'network_provider' from source: set_fact 15621 1726882607.74367: variable 'ansible_facts' from source: unknown 15621 1726882607.75416: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 15621 1726882607.75421: when evaluation is False, skipping this task 15621 1726882607.75427: _execute() done 15621 1726882607.75430: dumping result to json 15621 1726882607.75433: done dumping result, returning 15621 1726882607.75436: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0affc7ec-ae25-af1a-5b92-000000000044] 15621 1726882607.75438: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000044 skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 15621 1726882607.75692: no more pending results, returning what we have 15621 1726882607.75697: results queue empty 15621 1726882607.75698: checking for any_errors_fatal 15621 1726882607.75704: done checking for any_errors_fatal 15621 1726882607.75705: checking for max_fail_percentage 15621 1726882607.75708: done checking for max_fail_percentage 15621 1726882607.75709: checking to see if all hosts have failed and the running result is not ok 15621 1726882607.75711: done checking to see if all hosts have failed 15621 1726882607.75711: getting the remaining hosts for this loop 15621 1726882607.75713: done getting the remaining hosts for this loop 15621 1726882607.75718: getting the next task for host managed_node3 15621 1726882607.75739: done getting next task for host managed_node3 15621 1726882607.75744: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 15621 1726882607.75746: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882607.75934: getting variables 15621 1726882607.75937: in VariableManager get_vars() 15621 1726882607.75977: Calling all_inventory to load vars for managed_node3 15621 1726882607.75979: Calling groups_inventory to load vars for managed_node3 15621 1726882607.75982: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882607.75992: Calling all_plugins_play to load vars for managed_node3 15621 1726882607.76000: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882607.76003: Calling groups_plugins_play to load vars for managed_node3 15621 1726882607.76539: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000044 15621 1726882607.76543: WORKER PROCESS EXITING 15621 1726882607.77826: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882607.80137: done with get_vars() 15621 1726882607.80170: done getting variables 15621 1726882607.80243: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:36:47 -0400 (0:00:00.178) 0:00:39.882 ****** 15621 1726882607.80285: entering _queue_task() for managed_node3/package 15621 1726882607.80730: worker is 1 (out of 1 available) 15621 1726882607.80744: exiting _queue_task() for managed_node3/package 15621 1726882607.80758: done queuing things up, now waiting for results queue to drain 15621 1726882607.80759: waiting for pending results... 
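The 'Install packages' step skipped just above (main.yml:73) is the role's idempotence guard: the package module is only invoked when something in network_packages is missing from the package facts gathered earlier in the run. A hedged sketch of that pattern follows; the task name, the 'package' action plugin, and the when-string are taken from the log, while the argument list is assumed and the package list is left as the role variable because its contents are not printed here.

    # Sketch of the skip pattern reported above; everything beyond the name,
    # module, and condition is assumed.
    - name: Install packages
      ansible.builtin.package:
        name: "{{ network_packages }}"
        state: present
      when: not network_packages is subset(ansible_facts.packages.keys())
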
15621 1726882607.81028: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 15621 1726882607.81135: in run() - task 0affc7ec-ae25-af1a-5b92-000000000045 15621 1726882607.81165: variable 'ansible_search_path' from source: unknown 15621 1726882607.81179: variable 'ansible_search_path' from source: unknown 15621 1726882607.81227: calling self._execute() 15621 1726882607.81340: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882607.81360: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882607.81377: variable 'omit' from source: magic vars 15621 1726882607.81701: variable 'ansible_distribution_major_version' from source: facts 15621 1726882607.81711: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882607.81804: variable 'network_state' from source: role '' defaults 15621 1726882607.81811: Evaluated conditional (network_state != {}): False 15621 1726882607.81815: when evaluation is False, skipping this task 15621 1726882607.81818: _execute() done 15621 1726882607.81820: dumping result to json 15621 1726882607.81827: done dumping result, returning 15621 1726882607.81834: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affc7ec-ae25-af1a-5b92-000000000045] 15621 1726882607.81841: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000045 15621 1726882607.81934: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000045 15621 1726882607.81938: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15621 1726882607.81989: no more pending results, returning what we have 15621 1726882607.81993: results queue empty 15621 1726882607.81994: checking for any_errors_fatal 15621 1726882607.82002: done checking for any_errors_fatal 15621 1726882607.82003: checking for max_fail_percentage 15621 1726882607.82004: done checking for max_fail_percentage 15621 1726882607.82006: checking to see if all hosts have failed and the running result is not ok 15621 1726882607.82006: done checking to see if all hosts have failed 15621 1726882607.82007: getting the remaining hosts for this loop 15621 1726882607.82008: done getting the remaining hosts for this loop 15621 1726882607.82012: getting the next task for host managed_node3 15621 1726882607.82017: done getting next task for host managed_node3 15621 1726882607.82021: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 15621 1726882607.82025: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882607.82040: getting variables 15621 1726882607.82041: in VariableManager get_vars() 15621 1726882607.82075: Calling all_inventory to load vars for managed_node3 15621 1726882607.82078: Calling groups_inventory to load vars for managed_node3 15621 1726882607.82080: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882607.82089: Calling all_plugins_play to load vars for managed_node3 15621 1726882607.82092: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882607.82094: Calling groups_plugins_play to load vars for managed_node3 15621 1726882607.84033: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882607.86859: done with get_vars() 15621 1726882607.86900: done getting variables 15621 1726882607.86974: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:36:47 -0400 (0:00:00.067) 0:00:39.949 ****** 15621 1726882607.87014: entering _queue_task() for managed_node3/package 15621 1726882607.87517: worker is 1 (out of 1 available) 15621 1726882607.87533: exiting _queue_task() for managed_node3/package 15621 1726882607.87544: done queuing things up, now waiting for results queue to drain 15621 1726882607.87546: waiting for pending results... 
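The two package tasks queued at main.yml:85 and main.yml:96 follow the same pattern and are both skipped in this run because network_state is empty. Below is a minimal reconstruction of the second of them: the log confirms the task name, the 'package' action plugin, and the condition; the package name is inferred from the task title and the state argument is assumed.

    # Reconstructed sketch: only runs when the caller drives the role through
    # the network_state variable, which is empty ({}) in this run.
    - name: Install python3-libnmstate when using network_state variable
      ansible.builtin.package:
        name: python3-libnmstate
        state: present
      when: network_state != {}
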
15621 1726882607.87943: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 15621 1726882607.87958: in run() - task 0affc7ec-ae25-af1a-5b92-000000000046 15621 1726882607.87985: variable 'ansible_search_path' from source: unknown 15621 1726882607.87994: variable 'ansible_search_path' from source: unknown 15621 1726882607.88061: calling self._execute() 15621 1726882607.88166: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882607.88190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882607.88290: variable 'omit' from source: magic vars 15621 1726882607.88652: variable 'ansible_distribution_major_version' from source: facts 15621 1726882607.88670: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882607.88828: variable 'network_state' from source: role '' defaults 15621 1726882607.88850: Evaluated conditional (network_state != {}): False 15621 1726882607.88857: when evaluation is False, skipping this task 15621 1726882607.88861: _execute() done 15621 1726882607.88864: dumping result to json 15621 1726882607.88866: done dumping result, returning 15621 1726882607.88869: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affc7ec-ae25-af1a-5b92-000000000046] 15621 1726882607.88875: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000046 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15621 1726882607.89030: no more pending results, returning what we have 15621 1726882607.89034: results queue empty 15621 1726882607.89035: checking for any_errors_fatal 15621 1726882607.89041: done checking for any_errors_fatal 15621 1726882607.89042: checking for max_fail_percentage 15621 1726882607.89044: done checking for max_fail_percentage 15621 1726882607.89045: checking to see if all hosts have failed and the running result is not ok 15621 1726882607.89046: done checking to see if all hosts have failed 15621 1726882607.89047: getting the remaining hosts for this loop 15621 1726882607.89048: done getting the remaining hosts for this loop 15621 1726882607.89052: getting the next task for host managed_node3 15621 1726882607.89058: done getting next task for host managed_node3 15621 1726882607.89062: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 15621 1726882607.89065: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882607.89083: getting variables 15621 1726882607.89085: in VariableManager get_vars() 15621 1726882607.89120: Calling all_inventory to load vars for managed_node3 15621 1726882607.89125: Calling groups_inventory to load vars for managed_node3 15621 1726882607.89127: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882607.89137: Calling all_plugins_play to load vars for managed_node3 15621 1726882607.89139: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882607.89142: Calling groups_plugins_play to load vars for managed_node3 15621 1726882607.89726: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000046 15621 1726882607.90219: WORKER PROCESS EXITING 15621 1726882607.90231: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882607.92075: done with get_vars() 15621 1726882607.92103: done getting variables 15621 1726882607.92168: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:36:47 -0400 (0:00:00.051) 0:00:40.001 ****** 15621 1726882607.92201: entering _queue_task() for managed_node3/service 15621 1726882607.92552: worker is 1 (out of 1 available) 15621 1726882607.92565: exiting _queue_task() for managed_node3/service 15621 1726882607.92577: done queuing things up, now waiting for results queue to drain 15621 1726882607.92579: waiting for pending results... 
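As with the consent prompt earlier, the restart task dispatched next (main.yml:109) is gated on the wireless/team flags, which stay false for this connection profile. A hedged sketch of its shape: the name, the 'service' action plugin, and the condition appear in the log, while the service-name variable and the restarted state are assumptions.

    # Sketch only: service name and state are assumed; the 'when' condition is
    # the false_condition reported in the skip result below.
    - name: Restart NetworkManager due to wireless or team interfaces
      ansible.builtin.service:
        name: "{{ network_service_name }}"
        state: restarted
      when: __network_wireless_connections_defined or __network_team_connections_defined
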
15621 1726882607.92786: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 15621 1726882607.92878: in run() - task 0affc7ec-ae25-af1a-5b92-000000000047 15621 1726882607.92889: variable 'ansible_search_path' from source: unknown 15621 1726882607.92892: variable 'ansible_search_path' from source: unknown 15621 1726882607.92927: calling self._execute() 15621 1726882607.93005: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882607.93010: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882607.93023: variable 'omit' from source: magic vars 15621 1726882607.93321: variable 'ansible_distribution_major_version' from source: facts 15621 1726882607.93327: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882607.93418: variable '__network_wireless_connections_defined' from source: role '' defaults 15621 1726882607.93563: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15621 1726882607.95556: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15621 1726882607.95612: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15621 1726882607.95647: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15621 1726882607.95674: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15621 1726882607.95697: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15621 1726882607.95826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882607.95832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882607.95836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882607.95862: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882607.95874: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882607.95914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882607.95936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882607.95959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 15621 1726882607.95989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882607.96000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882607.96035: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882607.96057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882607.96076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882607.96103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882607.96114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882607.96251: variable 'network_connections' from source: play vars 15621 1726882607.96265: variable 'profile' from source: play vars 15621 1726882607.96325: variable 'profile' from source: play vars 15621 1726882607.96328: variable 'interface' from source: set_fact 15621 1726882607.96383: variable 'interface' from source: set_fact 15621 1726882607.96452: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15621 1726882607.96586: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15621 1726882607.96617: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15621 1726882607.96643: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15621 1726882607.96665: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15621 1726882607.96706: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15621 1726882607.96722: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15621 1726882607.96743: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882607.96761: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15621 1726882607.96802: variable '__network_team_connections_defined' from source: role '' defaults 15621 
1726882607.96978: variable 'network_connections' from source: play vars 15621 1726882607.96981: variable 'profile' from source: play vars 15621 1726882607.97031: variable 'profile' from source: play vars 15621 1726882607.97036: variable 'interface' from source: set_fact 15621 1726882607.97078: variable 'interface' from source: set_fact 15621 1726882607.97096: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15621 1726882607.97099: when evaluation is False, skipping this task 15621 1726882607.97102: _execute() done 15621 1726882607.97107: dumping result to json 15621 1726882607.97110: done dumping result, returning 15621 1726882607.97119: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affc7ec-ae25-af1a-5b92-000000000047] 15621 1726882607.97131: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000047 15621 1726882607.97220: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000047 15621 1726882607.97226: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15621 1726882607.97304: no more pending results, returning what we have 15621 1726882607.97308: results queue empty 15621 1726882607.97308: checking for any_errors_fatal 15621 1726882607.97315: done checking for any_errors_fatal 15621 1726882607.97316: checking for max_fail_percentage 15621 1726882607.97318: done checking for max_fail_percentage 15621 1726882607.97319: checking to see if all hosts have failed and the running result is not ok 15621 1726882607.97320: done checking to see if all hosts have failed 15621 1726882607.97320: getting the remaining hosts for this loop 15621 1726882607.97324: done getting the remaining hosts for this loop 15621 1726882607.97328: getting the next task for host managed_node3 15621 1726882607.97333: done getting next task for host managed_node3 15621 1726882607.97337: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 15621 1726882607.97339: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882607.97355: getting variables 15621 1726882607.97356: in VariableManager get_vars() 15621 1726882607.97395: Calling all_inventory to load vars for managed_node3 15621 1726882607.97398: Calling groups_inventory to load vars for managed_node3 15621 1726882607.97400: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882607.97410: Calling all_plugins_play to load vars for managed_node3 15621 1726882607.97412: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882607.97415: Calling groups_plugins_play to load vars for managed_node3 15621 1726882607.98409: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882607.99580: done with get_vars() 15621 1726882607.99606: done getting variables 15621 1726882607.99659: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:36:47 -0400 (0:00:00.074) 0:00:40.076 ****** 15621 1726882607.99685: entering _queue_task() for managed_node3/service 15621 1726882607.99981: worker is 1 (out of 1 available) 15621 1726882607.99994: exiting _queue_task() for managed_node3/service 15621 1726882608.00006: done queuing things up, now waiting for results queue to drain 15621 1726882608.00008: waiting for pending results... 15621 1726882608.00202: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 15621 1726882608.00279: in run() - task 0affc7ec-ae25-af1a-5b92-000000000048 15621 1726882608.00292: variable 'ansible_search_path' from source: unknown 15621 1726882608.00296: variable 'ansible_search_path' from source: unknown 15621 1726882608.00330: calling self._execute() 15621 1726882608.00406: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882608.00411: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882608.00420: variable 'omit' from source: magic vars 15621 1726882608.00717: variable 'ansible_distribution_major_version' from source: facts 15621 1726882608.00728: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882608.00845: variable 'network_provider' from source: set_fact 15621 1726882608.00849: variable 'network_state' from source: role '' defaults 15621 1726882608.00858: Evaluated conditional (network_provider == "nm" or network_state != {}): True 15621 1726882608.00864: variable 'omit' from source: magic vars 15621 1726882608.00899: variable 'omit' from source: magic vars 15621 1726882608.00925: variable 'network_service_name' from source: role '' defaults 15621 1726882608.00978: variable 'network_service_name' from source: role '' defaults 15621 1726882608.01055: variable '__network_provider_setup' from source: role '' defaults 15621 1726882608.01061: variable '__network_service_name_default_nm' from source: role '' defaults 15621 1726882608.01109: variable '__network_service_name_default_nm' from source: role '' defaults 15621 1726882608.01116: variable '__network_packages_default_nm' from source: role '' defaults 
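Unlike the preceding steps, the 'Enable and start NetworkManager' task (main.yml:122) actually runs: the trace above shows its conditional (network_provider == "nm" or network_state != {}) evaluating to True and network_service_name being resolved, and the SSH activity later in the trace is this task reaching the managed host. A minimal sketch under those observations; the state and enabled arguments are assumptions, since the module arguments themselves are not printed in this log.

    # Sketch: the name variable and 'when' condition appear in the trace above;
    # state/enabled are assumed for illustration.
    - name: Enable and start NetworkManager
      ansible.builtin.service:
        name: "{{ network_service_name }}"
        state: started
        enabled: true
      when: network_provider == "nm" or network_state != {}
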
15621 1726882608.01166: variable '__network_packages_default_nm' from source: role '' defaults 15621 1726882608.01329: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15621 1726882608.03124: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15621 1726882608.03179: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15621 1726882608.03208: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15621 1726882608.03237: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15621 1726882608.03256: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15621 1726882608.03326: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882608.03348: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882608.03367: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882608.03400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882608.03414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882608.03451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882608.03469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882608.03488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882608.03518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882608.03533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882608.03684: variable '__network_packages_default_gobject_packages' from source: role '' defaults 15621 1726882608.03768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882608.03786: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882608.03804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882608.03835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882608.03848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882608.03910: variable 'ansible_python' from source: facts 15621 1726882608.03929: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 15621 1726882608.03990: variable '__network_wpa_supplicant_required' from source: role '' defaults 15621 1726882608.04050: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15621 1726882608.04140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882608.04160: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882608.04182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882608.04207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882608.04218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882608.04257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882608.04290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882608.04301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882608.04329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882608.04340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882608.04445: variable 'network_connections' from 
source: play vars 15621 1726882608.04452: variable 'profile' from source: play vars 15621 1726882608.04511: variable 'profile' from source: play vars 15621 1726882608.04515: variable 'interface' from source: set_fact 15621 1726882608.04563: variable 'interface' from source: set_fact 15621 1726882608.04642: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15621 1726882608.04778: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15621 1726882608.04815: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15621 1726882608.04851: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15621 1726882608.04882: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15621 1726882608.04928: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15621 1726882608.04952: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15621 1726882608.04977: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882608.05001: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15621 1726882608.05039: variable '__network_wireless_connections_defined' from source: role '' defaults 15621 1726882608.05228: variable 'network_connections' from source: play vars 15621 1726882608.05235: variable 'profile' from source: play vars 15621 1726882608.05292: variable 'profile' from source: play vars 15621 1726882608.05295: variable 'interface' from source: set_fact 15621 1726882608.05341: variable 'interface' from source: set_fact 15621 1726882608.05366: variable '__network_packages_default_wireless' from source: role '' defaults 15621 1726882608.05426: variable '__network_wireless_connections_defined' from source: role '' defaults 15621 1726882608.05627: variable 'network_connections' from source: play vars 15621 1726882608.05631: variable 'profile' from source: play vars 15621 1726882608.05684: variable 'profile' from source: play vars 15621 1726882608.05687: variable 'interface' from source: set_fact 15621 1726882608.05746: variable 'interface' from source: set_fact 15621 1726882608.05765: variable '__network_packages_default_team' from source: role '' defaults 15621 1726882608.05824: variable '__network_team_connections_defined' from source: role '' defaults 15621 1726882608.06026: variable 'network_connections' from source: play vars 15621 1726882608.06030: variable 'profile' from source: play vars 15621 1726882608.06083: variable 'profile' from source: play vars 15621 1726882608.06087: variable 'interface' from source: set_fact 15621 1726882608.06144: variable 'interface' from source: set_fact 15621 1726882608.06185: variable '__network_service_name_default_initscripts' from source: role '' defaults 15621 1726882608.06230: variable '__network_service_name_default_initscripts' from source: role '' defaults 15621 1726882608.06236: 
variable '__network_packages_default_initscripts' from source: role '' defaults 15621 1726882608.06283: variable '__network_packages_default_initscripts' from source: role '' defaults 15621 1726882608.06428: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 15621 1726882608.06770: variable 'network_connections' from source: play vars 15621 1726882608.06777: variable 'profile' from source: play vars 15621 1726882608.06824: variable 'profile' from source: play vars 15621 1726882608.06828: variable 'interface' from source: set_fact 15621 1726882608.06879: variable 'interface' from source: set_fact 15621 1726882608.06887: variable 'ansible_distribution' from source: facts 15621 1726882608.06890: variable '__network_rh_distros' from source: role '' defaults 15621 1726882608.06896: variable 'ansible_distribution_major_version' from source: facts 15621 1726882608.06907: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 15621 1726882608.07030: variable 'ansible_distribution' from source: facts 15621 1726882608.07034: variable '__network_rh_distros' from source: role '' defaults 15621 1726882608.07037: variable 'ansible_distribution_major_version' from source: facts 15621 1726882608.07044: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 15621 1726882608.07163: variable 'ansible_distribution' from source: facts 15621 1726882608.07166: variable '__network_rh_distros' from source: role '' defaults 15621 1726882608.07170: variable 'ansible_distribution_major_version' from source: facts 15621 1726882608.07199: variable 'network_provider' from source: set_fact 15621 1726882608.07215: variable 'omit' from source: magic vars 15621 1726882608.07240: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882608.07263: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882608.07277: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882608.07292: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882608.07300: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882608.07332: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882608.07335: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882608.07338: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882608.07413: Set connection var ansible_connection to ssh 15621 1726882608.07420: Set connection var ansible_shell_executable to /bin/sh 15621 1726882608.07428: Set connection var ansible_timeout to 10 15621 1726882608.07431: Set connection var ansible_shell_type to sh 15621 1726882608.07436: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882608.07442: Set connection var ansible_pipelining to False 15621 1726882608.07466: variable 'ansible_shell_executable' from source: unknown 15621 1726882608.07470: variable 'ansible_connection' from source: unknown 15621 1726882608.07475: variable 'ansible_module_compression' from source: unknown 15621 1726882608.07477: variable 'ansible_shell_type' from source: unknown 15621 1726882608.07480: variable 'ansible_shell_executable' from 
source: unknown 15621 1726882608.07483: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882608.07489: variable 'ansible_pipelining' from source: unknown 15621 1726882608.07492: variable 'ansible_timeout' from source: unknown 15621 1726882608.07494: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882608.07566: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15621 1726882608.07578: variable 'omit' from source: magic vars 15621 1726882608.07581: starting attempt loop 15621 1726882608.07584: running the handler 15621 1726882608.07643: variable 'ansible_facts' from source: unknown 15621 1726882608.08146: _low_level_execute_command(): starting 15621 1726882608.08152: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15621 1726882608.08691: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882608.08695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882608.08698: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882608.08700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882608.08758: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882608.08761: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882608.08763: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882608.08862: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882608.10637: stdout chunk (state=3): >>>/root <<< 15621 1726882608.10736: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882608.10801: stderr chunk (state=3): >>><<< 15621 1726882608.10805: stdout chunk (state=3): >>><<< 15621 1726882608.10828: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882608.10840: _low_level_execute_command(): starting 15621 1726882608.10846: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882608.108277-17051-27281723043513 `" && echo ansible-tmp-1726882608.108277-17051-27281723043513="` echo /root/.ansible/tmp/ansible-tmp-1726882608.108277-17051-27281723043513 `" ) && sleep 0' 15621 1726882608.11340: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882608.11343: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882608.11346: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration <<< 15621 1726882608.11348: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882608.11350: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882608.11404: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882608.11407: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882608.11411: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882608.11503: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882608.13461: stdout chunk (state=3): >>>ansible-tmp-1726882608.108277-17051-27281723043513=/root/.ansible/tmp/ansible-tmp-1726882608.108277-17051-27281723043513 <<< 15621 1726882608.13576: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882608.13630: stderr chunk (state=3): >>><<< 15621 1726882608.13633: stdout chunk (state=3): >>><<< 15621 1726882608.13649: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882608.108277-17051-27281723043513=/root/.ansible/tmp/ansible-tmp-1726882608.108277-17051-27281723043513 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 
10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882608.13682: variable 'ansible_module_compression' from source: unknown 15621 1726882608.13726: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15621x2kdpbzd/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 15621 1726882608.13786: variable 'ansible_facts' from source: unknown 15621 1726882608.13954: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882608.108277-17051-27281723043513/AnsiballZ_systemd.py 15621 1726882608.14191: Sending initial data 15621 1726882608.14195: Sent initial data (154 bytes) 15621 1726882608.14838: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882608.14842: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882608.14844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882608.14847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882608.14849: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882608.14862: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882608.14874: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882608.14906: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882608.15006: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882608.16626: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: 
Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 15621 1726882608.16634: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15621 1726882608.16707: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15621 1726882608.16800: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmpeo08vucw /root/.ansible/tmp/ansible-tmp-1726882608.108277-17051-27281723043513/AnsiballZ_systemd.py <<< 15621 1726882608.16804: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882608.108277-17051-27281723043513/AnsiballZ_systemd.py" <<< 15621 1726882608.16883: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmpeo08vucw" to remote "/root/.ansible/tmp/ansible-tmp-1726882608.108277-17051-27281723043513/AnsiballZ_systemd.py" <<< 15621 1726882608.16887: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882608.108277-17051-27281723043513/AnsiballZ_systemd.py" <<< 15621 1726882608.18727: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882608.18731: stdout chunk (state=3): >>><<< 15621 1726882608.18733: stderr chunk (state=3): >>><<< 15621 1726882608.18736: done transferring module to remote 15621 1726882608.18738: _low_level_execute_command(): starting 15621 1726882608.18740: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882608.108277-17051-27281723043513/ /root/.ansible/tmp/ansible-tmp-1726882608.108277-17051-27281723043513/AnsiballZ_systemd.py && sleep 0' 15621 1726882608.19342: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882608.19351: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882608.19369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882608.19386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882608.19466: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 15621 1726882608.19470: stderr chunk (state=3): >>>debug2: match not found <<< 15621 1726882608.19472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882608.19474: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15621 1726882608.19477: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.226 is address <<< 15621 1726882608.19479: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15621 1726882608.19480: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882608.19482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882608.19484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882608.19540: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882608.19556: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882608.19567: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882608.19592: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882608.19707: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882608.21593: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882608.21712: stderr chunk (state=3): >>><<< 15621 1726882608.21716: stdout chunk (state=3): >>><<< 15621 1726882608.21822: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882608.21828: _low_level_execute_command(): starting 15621 1726882608.21831: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882608.108277-17051-27281723043513/AnsiballZ_systemd.py && sleep 0' 15621 1726882608.22438: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882608.22455: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882608.22477: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882608.22499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882608.22545: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found <<< 15621 1726882608.22560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882608.22655: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882608.22676: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882608.22803: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882608.54472: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "685", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:25:00 EDT", "ExecMainStartTimestampMonotonic": "45437073", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "685", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3550", "MemoryCurrent": "11829248", "MemoryPeak": "13709312", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3544752128", "CPUUsageNSec": "1857998000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", 
"MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCO<<< 15621 1726882608.54537: stdout chunk (state=3): >>>RE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target shutdown.target", "After": "systemd-journald.socket system.slice basic.target dbus-broker.service dbus.socket network-pre.target cloud-init-local.service sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:22 EDT", "StateChangeTimestampMonotonic": "486988773", "InactiveExitTimestamp": "Fri 2024-09-20 21:25:00 EDT", "InactiveExitTimestampMonotonic": "45437210", "ActiveEnterTimestamp": "Fri 2024-09-20 21:25:02 EDT", "ActiveEnterTimestampMonotonic": "47371748", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:25:00 EDT", "ConditionTimestampMonotonic": "45429688", "AssertTimestamp": "Fri 2024-09-20 21:25:00 EDT", "AssertTimestampMonotonic": "45429690", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6a93edddfc3744e5bee117df30fc836d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 15621 1726882608.56383: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882608.56395: stderr chunk (state=3): >>>Shared connection to 10.31.45.226 closed. <<< 15621 1726882608.56459: stderr chunk (state=3): >>><<< 15621 1726882608.56462: stdout chunk (state=3): >>><<< 15621 1726882608.56480: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "685", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:25:00 EDT", "ExecMainStartTimestampMonotonic": "45437073", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "685", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3550", "MemoryCurrent": "11829248", "MemoryPeak": "13709312", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3544752128", "CPUUsageNSec": "1857998000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", 
"MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target shutdown.target", "After": "systemd-journald.socket system.slice basic.target dbus-broker.service dbus.socket network-pre.target cloud-init-local.service sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:22 EDT", "StateChangeTimestampMonotonic": "486988773", "InactiveExitTimestamp": "Fri 2024-09-20 21:25:00 EDT", "InactiveExitTimestampMonotonic": "45437210", "ActiveEnterTimestamp": "Fri 2024-09-20 21:25:02 EDT", "ActiveEnterTimestampMonotonic": "47371748", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:25:00 EDT", "ConditionTimestampMonotonic": "45429688", "AssertTimestamp": "Fri 2024-09-20 21:25:00 EDT", "AssertTimestampMonotonic": "45429690", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6a93edddfc3744e5bee117df30fc836d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 
Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 15621 1726882608.56610: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882608.108277-17051-27281723043513/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15621 1726882608.56629: _low_level_execute_command(): starting 15621 1726882608.56634: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882608.108277-17051-27281723043513/ > /dev/null 2>&1 && sleep 0' 15621 1726882608.57447: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882608.57473: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882608.59378: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 <<< 15621 1726882608.59424: stderr chunk (state=3): >>><<< 15621 1726882608.59429: stdout chunk (state=3): >>><<< 15621 1726882608.59442: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882608.59449: handler run complete 15621 1726882608.59490: attempt loop complete, returning result 15621 1726882608.59494: _execute() done 15621 1726882608.59500: dumping result to json 15621 1726882608.59515: done dumping result, returning 15621 1726882608.59525: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affc7ec-ae25-af1a-5b92-000000000048] 15621 1726882608.59531: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000048 15621 1726882608.60031: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000048 15621 1726882608.60035: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15621 1726882608.60072: no more pending results, returning what we have 15621 1726882608.60075: results queue empty 15621 1726882608.60075: checking for any_errors_fatal 15621 1726882608.60078: done checking for any_errors_fatal 15621 1726882608.60078: checking for max_fail_percentage 15621 1726882608.60079: done checking for max_fail_percentage 15621 1726882608.60080: checking to see if all hosts have failed and the running result is not ok 15621 1726882608.60081: done checking to see if all hosts have failed 15621 1726882608.60081: getting the remaining hosts for this loop 15621 1726882608.60082: done getting the remaining hosts for this loop 15621 1726882608.60084: getting the next task for host managed_node3 15621 1726882608.60088: done getting next task for host managed_node3 15621 1726882608.60090: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 15621 1726882608.60092: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882608.60098: getting variables 15621 1726882608.60099: in VariableManager get_vars() 15621 1726882608.60127: Calling all_inventory to load vars for managed_node3 15621 1726882608.60133: Calling groups_inventory to load vars for managed_node3 15621 1726882608.60137: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882608.60151: Calling all_plugins_play to load vars for managed_node3 15621 1726882608.60154: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882608.60158: Calling groups_plugins_play to load vars for managed_node3 15621 1726882608.61609: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882608.63672: done with get_vars() 15621 1726882608.63698: done getting variables 15621 1726882608.63766: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:36:48 -0400 (0:00:00.641) 0:00:40.717 ****** 15621 1726882608.63797: entering _queue_task() for managed_node3/service 15621 1726882608.64140: worker is 1 (out of 1 available) 15621 1726882608.64152: exiting _queue_task() for managed_node3/service 15621 1726882608.64165: done queuing things up, now waiting for results queue to drain 15621 1726882608.64167: waiting for pending results... 15621 1726882608.64641: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 15621 1726882608.64647: in run() - task 0affc7ec-ae25-af1a-5b92-000000000049 15621 1726882608.64650: variable 'ansible_search_path' from source: unknown 15621 1726882608.64653: variable 'ansible_search_path' from source: unknown 15621 1726882608.64666: calling self._execute() 15621 1726882608.64771: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882608.64785: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882608.64801: variable 'omit' from source: magic vars 15621 1726882608.65209: variable 'ansible_distribution_major_version' from source: facts 15621 1726882608.65232: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882608.65363: variable 'network_provider' from source: set_fact 15621 1726882608.65376: Evaluated conditional (network_provider == "nm"): True 15621 1726882608.65484: variable '__network_wpa_supplicant_required' from source: role '' defaults 15621 1726882608.65583: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15621 1726882608.65781: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15621 1726882608.67537: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15621 1726882608.67584: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15621 1726882608.67612: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15621 1726882608.67645: Loading FilterModule 
'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15621 1726882608.67666: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15621 1726882608.67751: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882608.67766: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882608.67789: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882608.67835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882608.67849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882608.67903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882608.68126: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882608.68130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882608.68133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882608.68136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882608.68139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882608.68141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882608.68143: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882608.68170: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882608.68195: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 15621 1726882608.68358: variable 'network_connections' from source: play vars 15621 1726882608.68375: variable 'profile' from source: play vars 15621 1726882608.68454: variable 'profile' from source: play vars 15621 1726882608.68464: variable 'interface' from source: set_fact 15621 1726882608.68536: variable 'interface' from source: set_fact 15621 1726882608.68615: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15621 1726882608.68789: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15621 1726882608.68835: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15621 1726882608.68872: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15621 1726882608.68907: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15621 1726882608.68959: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15621 1726882608.68991: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15621 1726882608.69031: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882608.69083: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15621 1726882608.69112: variable '__network_wireless_connections_defined' from source: role '' defaults 15621 1726882608.69294: variable 'network_connections' from source: play vars 15621 1726882608.69298: variable 'profile' from source: play vars 15621 1726882608.69347: variable 'profile' from source: play vars 15621 1726882608.69350: variable 'interface' from source: set_fact 15621 1726882608.69396: variable 'interface' from source: set_fact 15621 1726882608.69418: Evaluated conditional (__network_wpa_supplicant_required): False 15621 1726882608.69424: when evaluation is False, skipping this task 15621 1726882608.69427: _execute() done 15621 1726882608.69440: dumping result to json 15621 1726882608.69442: done dumping result, returning 15621 1726882608.69445: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affc7ec-ae25-af1a-5b92-000000000049] 15621 1726882608.69448: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000049 15621 1726882608.69538: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000049 15621 1726882608.69541: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 15621 1726882608.69609: no more pending results, returning what we have 15621 1726882608.69612: results queue empty 15621 1726882608.69613: checking for any_errors_fatal 15621 1726882608.69636: done checking for any_errors_fatal 15621 1726882608.69637: checking for max_fail_percentage 15621 1726882608.69639: done checking for max_fail_percentage 15621 
1726882608.69639: checking to see if all hosts have failed and the running result is not ok 15621 1726882608.69640: done checking to see if all hosts have failed 15621 1726882608.69641: getting the remaining hosts for this loop 15621 1726882608.69642: done getting the remaining hosts for this loop 15621 1726882608.69647: getting the next task for host managed_node3 15621 1726882608.69652: done getting next task for host managed_node3 15621 1726882608.69655: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 15621 1726882608.69657: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15621 1726882608.69675: getting variables 15621 1726882608.69677: in VariableManager get_vars() 15621 1726882608.69712: Calling all_inventory to load vars for managed_node3 15621 1726882608.69714: Calling groups_inventory to load vars for managed_node3 15621 1726882608.69716: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882608.69737: Calling all_plugins_play to load vars for managed_node3 15621 1726882608.69740: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882608.69744: Calling groups_plugins_play to load vars for managed_node3 15621 1726882608.70786: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882608.72652: done with get_vars() 15621 1726882608.72689: done getting variables 15621 1726882608.72759: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:36:48 -0400 (0:00:00.089) 0:00:40.807 ****** 15621 1726882608.72795: entering _queue_task() for managed_node3/service 15621 1726882608.73191: worker is 1 (out of 1 available) 15621 1726882608.73205: exiting _queue_task() for managed_node3/service 15621 1726882608.73219: done queuing things up, now waiting for results queue to drain 15621 1726882608.73221: waiting for pending results... 
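The `skipping:` result above, and the one that follows for the service task just queued, both come from tasks guarded by `when:` conditions that evaluated to False (`__network_wpa_supplicant_required` here, and `network_provider == "initscripts"` next). A minimal, hypothetical sketch of the pattern that produces this kind of skip output is shown below; only the task name and the condition are taken from the log, the module arguments are illustrative and not the role's actual tasks/main.yml content.

```yaml
# Hypothetical sketch of a conditionally skipped task; module arguments are
# illustrative, only the task name and the when: condition come from the log.
- name: Enable and start wpa_supplicant
  ansible.builtin.service:
    name: wpa_supplicant
    state: started
    enabled: true
  when: __network_wpa_supplicant_required | bool   # evaluated False above, so the task is skipped
```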
15621 1726882608.73644: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 15621 1726882608.73663: in run() - task 0affc7ec-ae25-af1a-5b92-00000000004a 15621 1726882608.73687: variable 'ansible_search_path' from source: unknown 15621 1726882608.73694: variable 'ansible_search_path' from source: unknown 15621 1726882608.73744: calling self._execute() 15621 1726882608.73849: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882608.73862: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882608.73880: variable 'omit' from source: magic vars 15621 1726882608.74304: variable 'ansible_distribution_major_version' from source: facts 15621 1726882608.74321: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882608.74458: variable 'network_provider' from source: set_fact 15621 1726882608.74468: Evaluated conditional (network_provider == "initscripts"): False 15621 1726882608.74479: when evaluation is False, skipping this task 15621 1726882608.74487: _execute() done 15621 1726882608.74500: dumping result to json 15621 1726882608.74510: done dumping result, returning 15621 1726882608.74526: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0affc7ec-ae25-af1a-5b92-00000000004a] 15621 1726882608.74538: sending task result for task 0affc7ec-ae25-af1a-5b92-00000000004a skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15621 1726882608.74778: no more pending results, returning what we have 15621 1726882608.74782: results queue empty 15621 1726882608.74783: checking for any_errors_fatal 15621 1726882608.74790: done checking for any_errors_fatal 15621 1726882608.74791: checking for max_fail_percentage 15621 1726882608.74794: done checking for max_fail_percentage 15621 1726882608.74795: checking to see if all hosts have failed and the running result is not ok 15621 1726882608.74796: done checking to see if all hosts have failed 15621 1726882608.74797: getting the remaining hosts for this loop 15621 1726882608.74798: done getting the remaining hosts for this loop 15621 1726882608.74803: getting the next task for host managed_node3 15621 1726882608.74809: done getting next task for host managed_node3 15621 1726882608.74814: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 15621 1726882608.74816: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882608.74836: getting variables 15621 1726882608.74838: in VariableManager get_vars() 15621 1726882608.74886: Calling all_inventory to load vars for managed_node3 15621 1726882608.74890: Calling groups_inventory to load vars for managed_node3 15621 1726882608.74892: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882608.74908: Calling all_plugins_play to load vars for managed_node3 15621 1726882608.74911: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882608.74914: Calling groups_plugins_play to load vars for managed_node3 15621 1726882608.74926: done sending task result for task 0affc7ec-ae25-af1a-5b92-00000000004a 15621 1726882608.74929: WORKER PROCESS EXITING 15621 1726882608.76841: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882608.79119: done with get_vars() 15621 1726882608.79146: done getting variables 15621 1726882608.79212: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:36:48 -0400 (0:00:00.064) 0:00:40.871 ****** 15621 1726882608.79249: entering _queue_task() for managed_node3/copy 15621 1726882608.79611: worker is 1 (out of 1 available) 15621 1726882608.79827: exiting _queue_task() for managed_node3/copy 15621 1726882608.79838: done queuing things up, now waiting for results queue to drain 15621 1726882608.79840: waiting for pending results... 
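The copy task queued above is likewise skipped (its `network_provider == "initscripts"` condition is False), after which the log reaches the task that actually changes something: "Configure networking connection profiles", which runs fedora.linux_system_roles.network_connections with provider `nm` and a single profile `lsr27` set to `state: down`. A play that drives the role to produce those module arguments might look roughly like the sketch below; the play name and hosts pattern are assumptions, and only the connection values mirror the module_args recorded further down in this log.

```yaml
# Hypothetical play; the network_connections values mirror the module_args seen
# later in this log (profile lsr27 brought down through the NetworkManager provider).
# network_provider is not set here; in this run it came from set_fact and resolved to "nm".
- name: Bring the test profile down
  hosts: managed_node3
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_connections:
          - name: lsr27
            state: down
```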
15621 1726882608.79944: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 15621 1726882608.80076: in run() - task 0affc7ec-ae25-af1a-5b92-00000000004b 15621 1726882608.80098: variable 'ansible_search_path' from source: unknown 15621 1726882608.80106: variable 'ansible_search_path' from source: unknown 15621 1726882608.80152: calling self._execute() 15621 1726882608.80255: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882608.80266: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882608.80290: variable 'omit' from source: magic vars 15621 1726882608.80691: variable 'ansible_distribution_major_version' from source: facts 15621 1726882608.80713: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882608.80852: variable 'network_provider' from source: set_fact 15621 1726882608.80865: Evaluated conditional (network_provider == "initscripts"): False 15621 1726882608.80877: when evaluation is False, skipping this task 15621 1726882608.80889: _execute() done 15621 1726882608.80940: dumping result to json 15621 1726882608.80943: done dumping result, returning 15621 1726882608.80947: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affc7ec-ae25-af1a-5b92-00000000004b] 15621 1726882608.80950: sending task result for task 0affc7ec-ae25-af1a-5b92-00000000004b skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 15621 1726882608.81101: no more pending results, returning what we have 15621 1726882608.81105: results queue empty 15621 1726882608.81106: checking for any_errors_fatal 15621 1726882608.81113: done checking for any_errors_fatal 15621 1726882608.81114: checking for max_fail_percentage 15621 1726882608.81116: done checking for max_fail_percentage 15621 1726882608.81117: checking to see if all hosts have failed and the running result is not ok 15621 1726882608.81118: done checking to see if all hosts have failed 15621 1726882608.81119: getting the remaining hosts for this loop 15621 1726882608.81121: done getting the remaining hosts for this loop 15621 1726882608.81128: getting the next task for host managed_node3 15621 1726882608.81134: done getting next task for host managed_node3 15621 1726882608.81139: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 15621 1726882608.81141: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882608.81159: getting variables 15621 1726882608.81161: in VariableManager get_vars() 15621 1726882608.81209: Calling all_inventory to load vars for managed_node3 15621 1726882608.81212: Calling groups_inventory to load vars for managed_node3 15621 1726882608.81215: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882608.81530: Calling all_plugins_play to load vars for managed_node3 15621 1726882608.81534: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882608.81539: Calling groups_plugins_play to load vars for managed_node3 15621 1726882608.82239: done sending task result for task 0affc7ec-ae25-af1a-5b92-00000000004b 15621 1726882608.82243: WORKER PROCESS EXITING 15621 1726882608.83183: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882608.85303: done with get_vars() 15621 1726882608.85329: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:36:48 -0400 (0:00:00.061) 0:00:40.933 ****** 15621 1726882608.85417: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 15621 1726882608.85733: worker is 1 (out of 1 available) 15621 1726882608.85746: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 15621 1726882608.85758: done queuing things up, now waiting for results queue to drain 15621 1726882608.85760: waiting for pending results... 15621 1726882608.86055: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 15621 1726882608.86187: in run() - task 0affc7ec-ae25-af1a-5b92-00000000004c 15621 1726882608.86210: variable 'ansible_search_path' from source: unknown 15621 1726882608.86218: variable 'ansible_search_path' from source: unknown 15621 1726882608.86263: calling self._execute() 15621 1726882608.86366: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882608.86386: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882608.86405: variable 'omit' from source: magic vars 15621 1726882608.86805: variable 'ansible_distribution_major_version' from source: facts 15621 1726882608.86828: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882608.86842: variable 'omit' from source: magic vars 15621 1726882608.86894: variable 'omit' from source: magic vars 15621 1726882608.87086: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15621 1726882608.90827: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15621 1726882608.91037: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15621 1726882608.91232: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15621 1726882608.91240: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15621 1726882608.91244: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15621 1726882608.91339: variable 'network_provider' from source: set_fact 15621 1726882608.91514: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882608.91556: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882608.91606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882608.91660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882608.91694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882608.91777: variable 'omit' from source: magic vars 15621 1726882608.91927: variable 'omit' from source: magic vars 15621 1726882608.92114: variable 'network_connections' from source: play vars 15621 1726882608.92123: variable 'profile' from source: play vars 15621 1726882608.92169: variable 'profile' from source: play vars 15621 1726882608.92182: variable 'interface' from source: set_fact 15621 1726882608.92268: variable 'interface' from source: set_fact 15621 1726882608.92462: variable 'omit' from source: magic vars 15621 1726882608.92477: variable '__lsr_ansible_managed' from source: task vars 15621 1726882608.92572: variable '__lsr_ansible_managed' from source: task vars 15621 1726882608.92847: Loaded config def from plugin (lookup/template) 15621 1726882608.92851: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 15621 1726882608.92879: File lookup term: get_ansible_managed.j2 15621 1726882608.92884: variable 'ansible_search_path' from source: unknown 15621 1726882608.92889: evaluation_path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 15621 1726882608.92905: search_path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 15621 1726882608.92918: variable 'ansible_search_path' from source: unknown 15621 1726882608.97377: variable 'ansible_managed' from source: unknown 15621 
1726882608.97465: variable 'omit' from source: magic vars 15621 1726882608.97489: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882608.97511: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882608.97526: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882608.97540: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882608.97549: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882608.97578: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882608.97582: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882608.97584: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882608.97653: Set connection var ansible_connection to ssh 15621 1726882608.97660: Set connection var ansible_shell_executable to /bin/sh 15621 1726882608.97666: Set connection var ansible_timeout to 10 15621 1726882608.97669: Set connection var ansible_shell_type to sh 15621 1726882608.97676: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882608.97679: Set connection var ansible_pipelining to False 15621 1726882608.97701: variable 'ansible_shell_executable' from source: unknown 15621 1726882608.97704: variable 'ansible_connection' from source: unknown 15621 1726882608.97708: variable 'ansible_module_compression' from source: unknown 15621 1726882608.97711: variable 'ansible_shell_type' from source: unknown 15621 1726882608.97714: variable 'ansible_shell_executable' from source: unknown 15621 1726882608.97716: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882608.97718: variable 'ansible_pipelining' from source: unknown 15621 1726882608.97721: variable 'ansible_timeout' from source: unknown 15621 1726882608.97728: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882608.97827: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15621 1726882608.97842: variable 'omit' from source: magic vars 15621 1726882608.97845: starting attempt loop 15621 1726882608.97848: running the handler 15621 1726882608.97857: _low_level_execute_command(): starting 15621 1726882608.97863: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15621 1726882608.98616: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882608.98633: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882608.98744: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882609.00532: stdout chunk (state=3): >>>/root <<< 15621 1726882609.00667: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882609.00762: stderr chunk (state=3): >>><<< 15621 1726882609.00768: stdout chunk (state=3): >>><<< 15621 1726882609.00924: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882609.00928: _low_level_execute_command(): starting 15621 1726882609.00931: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882609.0080657-17076-2558451193053 `" && echo ansible-tmp-1726882609.0080657-17076-2558451193053="` echo /root/.ansible/tmp/ansible-tmp-1726882609.0080657-17076-2558451193053 `" ) && sleep 0' 15621 1726882609.01579: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882609.01627: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882609.01633: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882609.01657: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882609.01759: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882609.03805: stdout chunk (state=3): >>>ansible-tmp-1726882609.0080657-17076-2558451193053=/root/.ansible/tmp/ansible-tmp-1726882609.0080657-17076-2558451193053 <<< 15621 1726882609.03918: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882609.03960: stderr chunk (state=3): >>><<< 15621 1726882609.03963: stdout chunk (state=3): >>><<< 15621 1726882609.03980: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882609.0080657-17076-2558451193053=/root/.ansible/tmp/ansible-tmp-1726882609.0080657-17076-2558451193053 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882609.04017: variable 'ansible_module_compression' from source: unknown 15621 1726882609.04054: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15621x2kdpbzd/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 15621 1726882609.04085: variable 'ansible_facts' from source: unknown 15621 1726882609.04146: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882609.0080657-17076-2558451193053/AnsiballZ_network_connections.py 15621 1726882609.04244: Sending initial data 15621 1726882609.04248: Sent initial data (166 bytes) 15621 1726882609.04692: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882609.04695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found <<< 15621 1726882609.04698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 15621 1726882609.04704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882609.04760: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882609.04764: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882609.04846: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882609.06525: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15621 1726882609.06627: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15621 1726882609.06717: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmpvpuz9bpw /root/.ansible/tmp/ansible-tmp-1726882609.0080657-17076-2558451193053/AnsiballZ_network_connections.py <<< 15621 1726882609.06722: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882609.0080657-17076-2558451193053/AnsiballZ_network_connections.py" <<< 15621 1726882609.06843: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmpvpuz9bpw" to remote "/root/.ansible/tmp/ansible-tmp-1726882609.0080657-17076-2558451193053/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882609.0080657-17076-2558451193053/AnsiballZ_network_connections.py" <<< 15621 1726882609.08012: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882609.08098: stderr chunk (state=3): >>><<< 15621 1726882609.08101: stdout chunk (state=3): >>><<< 15621 1726882609.08116: done transferring module to remote 15621 1726882609.08127: _low_level_execute_command(): starting 15621 1726882609.08132: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882609.0080657-17076-2558451193053/ /root/.ansible/tmp/ansible-tmp-1726882609.0080657-17076-2558451193053/AnsiballZ_network_connections.py && sleep 0' 15621 1726882609.08689: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882609.08697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882609.08701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882609.08704: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882609.08706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882609.08708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882609.08799: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882609.08804: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882609.08916: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882609.10826: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882609.10894: stderr chunk (state=3): >>><<< 15621 1726882609.10901: stdout chunk (state=3): >>><<< 15621 1726882609.10925: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882609.10928: _low_level_execute_command(): starting 15621 1726882609.10931: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882609.0080657-17076-2558451193053/AnsiballZ_network_connections.py && sleep 0' 15621 1726882609.11419: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882609.11425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found <<< 15621 1726882609.11428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15621 1726882609.11430: stderr chunk 
(state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882609.11432: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882609.11488: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882609.11492: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882609.11576: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882609.44113: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 15621 1726882609.46338: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. <<< 15621 1726882609.46342: stdout chunk (state=3): >>><<< 15621 1726882609.46345: stderr chunk (state=3): >>><<< 15621 1726882609.46366: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared 
connection to 10.31.45.226 closed. 15621 1726882609.46447: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'lsr27', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882609.0080657-17076-2558451193053/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15621 1726882609.46451: _low_level_execute_command(): starting 15621 1726882609.46454: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882609.0080657-17076-2558451193053/ > /dev/null 2>&1 && sleep 0' 15621 1726882609.47248: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882609.47277: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882609.47391: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882609.49630: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882609.49633: stdout chunk (state=3): >>><<< 15621 1726882609.49636: stderr chunk (state=3): >>><<< 15621 1726882609.49639: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882609.49642: handler run complete 15621 1726882609.49645: attempt loop complete, returning result 15621 1726882609.49646: _execute() done 15621 1726882609.49649: dumping result to json 15621 1726882609.49652: done dumping result, returning 15621 1726882609.49654: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affc7ec-ae25-af1a-5b92-00000000004c] 15621 1726882609.49657: sending task result for task 0affc7ec-ae25-af1a-5b92-00000000004c 15621 1726882609.49736: done sending task result for task 0affc7ec-ae25-af1a-5b92-00000000004c 15621 1726882609.49740: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "lsr27", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 15621 1726882609.49856: no more pending results, returning what we have 15621 1726882609.49860: results queue empty 15621 1726882609.49861: checking for any_errors_fatal 15621 1726882609.49868: done checking for any_errors_fatal 15621 1726882609.49869: checking for max_fail_percentage 15621 1726882609.49871: done checking for max_fail_percentage 15621 1726882609.49871: checking to see if all hosts have failed and the running result is not ok 15621 1726882609.49873: done checking to see if all hosts have failed 15621 1726882609.49873: getting the remaining hosts for this loop 15621 1726882609.49875: done getting the remaining hosts for this loop 15621 1726882609.49880: getting the next task for host managed_node3 15621 1726882609.49888: done getting next task for host managed_node3 15621 1726882609.49892: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 15621 1726882609.49898: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882609.49910: getting variables 15621 1726882609.49912: in VariableManager get_vars() 15621 1726882609.50067: Calling all_inventory to load vars for managed_node3 15621 1726882609.50070: Calling groups_inventory to load vars for managed_node3 15621 1726882609.50073: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882609.50085: Calling all_plugins_play to load vars for managed_node3 15621 1726882609.50088: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882609.50092: Calling groups_plugins_play to load vars for managed_node3 15621 1726882609.52248: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882609.54609: done with get_vars() 15621 1726882609.54644: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:36:49 -0400 (0:00:00.693) 0:00:41.626 ****** 15621 1726882609.54740: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 15621 1726882609.55158: worker is 1 (out of 1 available) 15621 1726882609.55173: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 15621 1726882609.55236: done queuing things up, now waiting for results queue to drain 15621 1726882609.55238: waiting for pending results... 15621 1726882609.55478: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 15621 1726882609.55608: in run() - task 0affc7ec-ae25-af1a-5b92-00000000004d 15621 1726882609.55641: variable 'ansible_search_path' from source: unknown 15621 1726882609.55651: variable 'ansible_search_path' from source: unknown 15621 1726882609.55697: calling self._execute() 15621 1726882609.55830: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882609.55834: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882609.55837: variable 'omit' from source: magic vars 15621 1726882609.56267: variable 'ansible_distribution_major_version' from source: facts 15621 1726882609.56329: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882609.56437: variable 'network_state' from source: role '' defaults 15621 1726882609.56458: Evaluated conditional (network_state != {}): False 15621 1726882609.56467: when evaluation is False, skipping this task 15621 1726882609.56475: _execute() done 15621 1726882609.56489: dumping result to json 15621 1726882609.56504: done dumping result, returning 15621 1726882609.56528: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0affc7ec-ae25-af1a-5b92-00000000004d] 15621 1726882609.56531: sending task result for task 0affc7ec-ae25-af1a-5b92-00000000004d 15621 1726882609.56669: done sending task result for task 0affc7ec-ae25-af1a-5b92-00000000004d 15621 1726882609.56672: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15621 1726882609.56734: no more pending results, returning what we have 15621 1726882609.56738: results queue empty 15621 1726882609.56739: checking for any_errors_fatal 15621 1726882609.56749: done checking for any_errors_fatal 15621 1726882609.56749: checking for max_fail_percentage 15621 
1726882609.56751: done checking for max_fail_percentage 15621 1726882609.56752: checking to see if all hosts have failed and the running result is not ok 15621 1726882609.56753: done checking to see if all hosts have failed 15621 1726882609.56754: getting the remaining hosts for this loop 15621 1726882609.56755: done getting the remaining hosts for this loop 15621 1726882609.56761: getting the next task for host managed_node3 15621 1726882609.56768: done getting next task for host managed_node3 15621 1726882609.56771: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 15621 1726882609.56774: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15621 1726882609.56790: getting variables 15621 1726882609.56792: in VariableManager get_vars() 15621 1726882609.56836: Calling all_inventory to load vars for managed_node3 15621 1726882609.56839: Calling groups_inventory to load vars for managed_node3 15621 1726882609.56841: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882609.56856: Calling all_plugins_play to load vars for managed_node3 15621 1726882609.56859: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882609.56862: Calling groups_plugins_play to load vars for managed_node3 15621 1726882609.59794: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882609.63714: done with get_vars() 15621 1726882609.63747: done getting variables 15621 1726882609.63814: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:36:49 -0400 (0:00:00.093) 0:00:41.719 ****** 15621 1726882609.64054: entering _queue_task() for managed_node3/debug 15621 1726882609.64619: worker is 1 (out of 1 available) 15621 1726882609.65034: exiting _queue_task() for managed_node3/debug 15621 1726882609.65043: done queuing things up, now waiting for results queue to drain 15621 1726882609.65045: waiting for pending results... 
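The two tasks traced next are plain debug tasks that print parts of the registered `__network_connections_result` fact: first its `stderr_lines`, then the full result structure. Their output format (`ok: [managed_node3] => { "__network_connections_result...": ... }`) is what `debug: var:` produces, so their definitions are likely close to the sketch below; the task names and variable names match the log, while the exact module arguments in the role are an assumption.

```yaml
# Hypothetical equivalents of the two debug tasks whose results follow in the log;
# task and variable names come from the log, the debug arguments are assumed.
- name: Show stderr messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result.stderr_lines

- name: Show debug messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result
```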
15621 1726882609.65440: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 15621 1726882609.65631: in run() - task 0affc7ec-ae25-af1a-5b92-00000000004e 15621 1726882609.65636: variable 'ansible_search_path' from source: unknown 15621 1726882609.65639: variable 'ansible_search_path' from source: unknown 15621 1726882609.65829: calling self._execute() 15621 1726882609.65887: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882609.65940: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882609.65956: variable 'omit' from source: magic vars 15621 1726882609.66884: variable 'ansible_distribution_major_version' from source: facts 15621 1726882609.66888: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882609.66890: variable 'omit' from source: magic vars 15621 1726882609.66942: variable 'omit' from source: magic vars 15621 1726882609.67036: variable 'omit' from source: magic vars 15621 1726882609.67147: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882609.67320: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882609.67333: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882609.67358: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882609.67377: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882609.67415: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882609.67629: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882609.67633: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882609.67762: Set connection var ansible_connection to ssh 15621 1726882609.67779: Set connection var ansible_shell_executable to /bin/sh 15621 1726882609.67792: Set connection var ansible_timeout to 10 15621 1726882609.67800: Set connection var ansible_shell_type to sh 15621 1726882609.67811: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882609.67824: Set connection var ansible_pipelining to False 15621 1726882609.67856: variable 'ansible_shell_executable' from source: unknown 15621 1726882609.68082: variable 'ansible_connection' from source: unknown 15621 1726882609.68086: variable 'ansible_module_compression' from source: unknown 15621 1726882609.68088: variable 'ansible_shell_type' from source: unknown 15621 1726882609.68090: variable 'ansible_shell_executable' from source: unknown 15621 1726882609.68092: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882609.68094: variable 'ansible_pipelining' from source: unknown 15621 1726882609.68096: variable 'ansible_timeout' from source: unknown 15621 1726882609.68098: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882609.68253: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15621 
1726882609.68270: variable 'omit' from source: magic vars 15621 1726882609.68305: starting attempt loop 15621 1726882609.68312: running the handler 15621 1726882609.68565: variable '__network_connections_result' from source: set_fact 15621 1726882609.68672: handler run complete 15621 1726882609.68752: attempt loop complete, returning result 15621 1726882609.68927: _execute() done 15621 1726882609.68930: dumping result to json 15621 1726882609.68933: done dumping result, returning 15621 1726882609.68936: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affc7ec-ae25-af1a-5b92-00000000004e] 15621 1726882609.68938: sending task result for task 0affc7ec-ae25-af1a-5b92-00000000004e 15621 1726882609.69014: done sending task result for task 0affc7ec-ae25-af1a-5b92-00000000004e 15621 1726882609.69018: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "" ] } 15621 1726882609.69092: no more pending results, returning what we have 15621 1726882609.69096: results queue empty 15621 1726882609.69097: checking for any_errors_fatal 15621 1726882609.69104: done checking for any_errors_fatal 15621 1726882609.69105: checking for max_fail_percentage 15621 1726882609.69107: done checking for max_fail_percentage 15621 1726882609.69108: checking to see if all hosts have failed and the running result is not ok 15621 1726882609.69109: done checking to see if all hosts have failed 15621 1726882609.69110: getting the remaining hosts for this loop 15621 1726882609.69111: done getting the remaining hosts for this loop 15621 1726882609.69116: getting the next task for host managed_node3 15621 1726882609.69124: done getting next task for host managed_node3 15621 1726882609.69128: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 15621 1726882609.69131: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882609.69143: getting variables 15621 1726882609.69145: in VariableManager get_vars() 15621 1726882609.69188: Calling all_inventory to load vars for managed_node3 15621 1726882609.69192: Calling groups_inventory to load vars for managed_node3 15621 1726882609.69194: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882609.69208: Calling all_plugins_play to load vars for managed_node3 15621 1726882609.69212: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882609.69216: Calling groups_plugins_play to load vars for managed_node3 15621 1726882609.72854: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882609.76950: done with get_vars() 15621 1726882609.76989: done getting variables 15621 1726882609.77267: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:36:49 -0400 (0:00:00.132) 0:00:41.852 ****** 15621 1726882609.77304: entering _queue_task() for managed_node3/debug 15621 1726882609.78094: worker is 1 (out of 1 available) 15621 1726882609.78109: exiting _queue_task() for managed_node3/debug 15621 1726882609.78125: done queuing things up, now waiting for results queue to drain 15621 1726882609.78126: waiting for pending results... 15621 1726882609.78742: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 15621 1726882609.78824: in run() - task 0affc7ec-ae25-af1a-5b92-00000000004f 15621 1726882609.78888: variable 'ansible_search_path' from source: unknown 15621 1726882609.78935: variable 'ansible_search_path' from source: unknown 15621 1726882609.79192: calling self._execute() 15621 1726882609.79328: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882609.79333: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882609.79336: variable 'omit' from source: magic vars 15621 1726882609.80185: variable 'ansible_distribution_major_version' from source: facts 15621 1726882609.80204: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882609.80329: variable 'omit' from source: magic vars 15621 1726882609.80343: variable 'omit' from source: magic vars 15621 1726882609.80528: variable 'omit' from source: magic vars 15621 1726882609.80562: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882609.80827: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882609.80833: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882609.80836: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882609.80838: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882609.80840: variable 
'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882609.80843: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882609.80930: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882609.81142: Set connection var ansible_connection to ssh 15621 1726882609.81162: Set connection var ansible_shell_executable to /bin/sh 15621 1726882609.81178: Set connection var ansible_timeout to 10 15621 1726882609.81186: Set connection var ansible_shell_type to sh 15621 1726882609.81198: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882609.81327: Set connection var ansible_pipelining to False 15621 1726882609.81331: variable 'ansible_shell_executable' from source: unknown 15621 1726882609.81333: variable 'ansible_connection' from source: unknown 15621 1726882609.81339: variable 'ansible_module_compression' from source: unknown 15621 1726882609.81342: variable 'ansible_shell_type' from source: unknown 15621 1726882609.81346: variable 'ansible_shell_executable' from source: unknown 15621 1726882609.81348: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882609.81350: variable 'ansible_pipelining' from source: unknown 15621 1726882609.81352: variable 'ansible_timeout' from source: unknown 15621 1726882609.81355: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882609.81638: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15621 1726882609.81657: variable 'omit' from source: magic vars 15621 1726882609.81662: starting attempt loop 15621 1726882609.81664: running the handler 15621 1726882609.81827: variable '__network_connections_result' from source: set_fact 15621 1726882609.82031: variable '__network_connections_result' from source: set_fact 15621 1726882609.82260: handler run complete 15621 1726882609.82306: attempt loop complete, returning result 15621 1726882609.82310: _execute() done 15621 1726882609.82313: dumping result to json 15621 1726882609.82315: done dumping result, returning 15621 1726882609.82430: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affc7ec-ae25-af1a-5b92-00000000004f] 15621 1726882609.82434: sending task result for task 0affc7ec-ae25-af1a-5b92-00000000004f 15621 1726882609.82527: done sending task result for task 0affc7ec-ae25-af1a-5b92-00000000004f 15621 1726882609.82531: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "lsr27", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 15621 1726882609.82632: no more pending results, returning what we have 15621 1726882609.82636: results queue empty 15621 1726882609.82637: checking for any_errors_fatal 15621 1726882609.82728: done checking for any_errors_fatal 15621 1726882609.82730: checking for max_fail_percentage 15621 1726882609.82732: done checking for max_fail_percentage 15621 1726882609.82733: checking to see if all hosts have 
failed and the running result is not ok 15621 1726882609.82735: done checking to see if all hosts have failed 15621 1726882609.82736: getting the remaining hosts for this loop 15621 1726882609.82737: done getting the remaining hosts for this loop 15621 1726882609.82742: getting the next task for host managed_node3 15621 1726882609.82749: done getting next task for host managed_node3 15621 1726882609.82759: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 15621 1726882609.82761: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15621 1726882609.82774: getting variables 15621 1726882609.82776: in VariableManager get_vars() 15621 1726882609.82817: Calling all_inventory to load vars for managed_node3 15621 1726882609.82820: Calling groups_inventory to load vars for managed_node3 15621 1726882609.82929: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882609.82940: Calling all_plugins_play to load vars for managed_node3 15621 1726882609.82944: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882609.82947: Calling groups_plugins_play to load vars for managed_node3 15621 1726882609.90790: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882609.94182: done with get_vars() 15621 1726882609.94216: done getting variables 15621 1726882609.94508: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:36:49 -0400 (0:00:00.172) 0:00:42.024 ****** 15621 1726882609.94543: entering _queue_task() for managed_node3/debug 15621 1726882609.95696: worker is 1 (out of 1 available) 15621 1726882609.95713: exiting _queue_task() for managed_node3/debug 15621 1726882609.95877: done queuing things up, now waiting for results queue to drain 15621 1726882609.95880: waiting for pending results... 
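The two debug tasks above print the __network_connections_result fact recorded earlier by the role: its module_args show a single connection profile, 'lsr27', being put into state 'down' by the 'nm' (NetworkManager) provider, with an empty stderr. As a minimal sketch of a role invocation that would yield those module_args — only the role name, the managed host, and the lsr27/down connection are taken from the trace; the play wrapper itself is an illustrative assumption:

    - name: Bring the lsr27 profile down        # illustrative play name
      hosts: managed_node3
      roles:
        - role: fedora.linux_system_roles.network
          vars:
            network_connections:                # consumed by the role and forwarded
              - name: lsr27                     # as the 'connections' module_args above
                state: down

The task queued next, 'Show debug messages for the network_state', is conditioned on the role's network_state variable; with network_state left at its default of {}, the condition evaluates to False and the task is skipped, which is what the trace below records.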
15621 1726882609.96205: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 15621 1726882609.96431: in run() - task 0affc7ec-ae25-af1a-5b92-000000000050 15621 1726882609.96662: variable 'ansible_search_path' from source: unknown 15621 1726882609.96667: variable 'ansible_search_path' from source: unknown 15621 1726882609.96710: calling self._execute() 15621 1726882609.96976: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882609.97018: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882609.97053: variable 'omit' from source: magic vars 15621 1726882609.98046: variable 'ansible_distribution_major_version' from source: facts 15621 1726882609.98227: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882609.98635: variable 'network_state' from source: role '' defaults 15621 1726882609.98647: Evaluated conditional (network_state != {}): False 15621 1726882609.98650: when evaluation is False, skipping this task 15621 1726882609.98653: _execute() done 15621 1726882609.98657: dumping result to json 15621 1726882609.98659: done dumping result, returning 15621 1726882609.98669: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affc7ec-ae25-af1a-5b92-000000000050] 15621 1726882609.98677: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000050 skipping: [managed_node3] => { "false_condition": "network_state != {}" } 15621 1726882609.98997: no more pending results, returning what we have 15621 1726882609.99001: results queue empty 15621 1726882609.99002: checking for any_errors_fatal 15621 1726882609.99012: done checking for any_errors_fatal 15621 1726882609.99013: checking for max_fail_percentage 15621 1726882609.99014: done checking for max_fail_percentage 15621 1726882609.99015: checking to see if all hosts have failed and the running result is not ok 15621 1726882609.99016: done checking to see if all hosts have failed 15621 1726882609.99017: getting the remaining hosts for this loop 15621 1726882609.99018: done getting the remaining hosts for this loop 15621 1726882609.99025: getting the next task for host managed_node3 15621 1726882609.99030: done getting next task for host managed_node3 15621 1726882609.99035: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 15621 1726882609.99037: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882609.99052: getting variables 15621 1726882609.99053: in VariableManager get_vars() 15621 1726882609.99100: Calling all_inventory to load vars for managed_node3 15621 1726882609.99104: Calling groups_inventory to load vars for managed_node3 15621 1726882609.99106: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882609.99355: Calling all_plugins_play to load vars for managed_node3 15621 1726882609.99361: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882609.99366: Calling groups_plugins_play to load vars for managed_node3 15621 1726882610.00159: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000050 15621 1726882610.00163: WORKER PROCESS EXITING 15621 1726882610.02450: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882610.05396: done with get_vars() 15621 1726882610.05420: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:36:50 -0400 (0:00:00.109) 0:00:42.134 ****** 15621 1726882610.05527: entering _queue_task() for managed_node3/ping 15621 1726882610.05872: worker is 1 (out of 1 available) 15621 1726882610.05886: exiting _queue_task() for managed_node3/ping 15621 1726882610.05897: done queuing things up, now waiting for results queue to drain 15621 1726882610.05899: waiting for pending results... 15621 1726882610.06204: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 15621 1726882610.06324: in run() - task 0affc7ec-ae25-af1a-5b92-000000000051 15621 1726882610.06342: variable 'ansible_search_path' from source: unknown 15621 1726882610.06347: variable 'ansible_search_path' from source: unknown 15621 1726882610.06430: calling self._execute() 15621 1726882610.06488: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882610.06834: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882610.06838: variable 'omit' from source: magic vars 15621 1726882610.06918: variable 'ansible_distribution_major_version' from source: facts 15621 1726882610.06927: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882610.06935: variable 'omit' from source: magic vars 15621 1726882610.06978: variable 'omit' from source: magic vars 15621 1726882610.07015: variable 'omit' from source: magic vars 15621 1726882610.07055: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882610.07094: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882610.07117: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882610.07140: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882610.07156: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882610.07185: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882610.07189: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882610.07191: variable 'ansible_ssh_extra_args' from source: host vars 
for 'managed_node3' 15621 1726882610.07302: Set connection var ansible_connection to ssh 15621 1726882610.07311: Set connection var ansible_shell_executable to /bin/sh 15621 1726882610.07317: Set connection var ansible_timeout to 10 15621 1726882610.07320: Set connection var ansible_shell_type to sh 15621 1726882610.07327: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882610.07333: Set connection var ansible_pipelining to False 15621 1726882610.07363: variable 'ansible_shell_executable' from source: unknown 15621 1726882610.07366: variable 'ansible_connection' from source: unknown 15621 1726882610.07369: variable 'ansible_module_compression' from source: unknown 15621 1726882610.07379: variable 'ansible_shell_type' from source: unknown 15621 1726882610.07381: variable 'ansible_shell_executable' from source: unknown 15621 1726882610.07384: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882610.07386: variable 'ansible_pipelining' from source: unknown 15621 1726882610.07388: variable 'ansible_timeout' from source: unknown 15621 1726882610.07390: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882610.07630: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15621 1726882610.07634: variable 'omit' from source: magic vars 15621 1726882610.07638: starting attempt loop 15621 1726882610.07641: running the handler 15621 1726882610.07644: _low_level_execute_command(): starting 15621 1726882610.07653: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15621 1726882610.08494: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found <<< 15621 1726882610.08499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882610.08502: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882610.08561: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882610.08655: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882610.10438: stdout chunk (state=3): >>>/root <<< 15621 1726882610.10612: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882610.10633: stderr chunk (state=3): >>><<< 15621 1726882610.10643: stdout chunk (state=3): >>><<< 15621 1726882610.10677: _low_level_execute_command() done: rc=0, stdout=/root , 
stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882610.10702: _low_level_execute_command(): starting 15621 1726882610.10734: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882610.1068478-17136-41932875055321 `" && echo ansible-tmp-1726882610.1068478-17136-41932875055321="` echo /root/.ansible/tmp/ansible-tmp-1726882610.1068478-17136-41932875055321 `" ) && sleep 0' 15621 1726882610.11445: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882610.11458: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882610.11742: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882610.13577: stdout chunk (state=3): >>>ansible-tmp-1726882610.1068478-17136-41932875055321=/root/.ansible/tmp/ansible-tmp-1726882610.1068478-17136-41932875055321 <<< 15621 1726882610.13785: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882610.13788: stdout chunk (state=3): >>><<< 15621 1726882610.13791: stderr chunk (state=3): >>><<< 15621 1726882610.13932: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882610.1068478-17136-41932875055321=/root/.ansible/tmp/ansible-tmp-1726882610.1068478-17136-41932875055321 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882610.13936: variable 'ansible_module_compression' from source: unknown 15621 1726882610.13939: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15621x2kdpbzd/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 15621 1726882610.13970: variable 'ansible_facts' from source: unknown 15621 1726882610.14056: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882610.1068478-17136-41932875055321/AnsiballZ_ping.py 15621 1726882610.14246: Sending initial data 15621 1726882610.14263: Sent initial data (152 bytes) 15621 1726882610.14860: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882610.14873: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882610.14887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882610.14917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882610.14937: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15621 1726882610.15034: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882610.15046: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882610.15065: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882610.15173: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882610.16788: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension 
"fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15621 1726882610.16858: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15621 1726882610.16972: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmp0hp3dan2 /root/.ansible/tmp/ansible-tmp-1726882610.1068478-17136-41932875055321/AnsiballZ_ping.py <<< 15621 1726882610.16989: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882610.1068478-17136-41932875055321/AnsiballZ_ping.py" <<< 15621 1726882610.17059: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmp0hp3dan2" to remote "/root/.ansible/tmp/ansible-tmp-1726882610.1068478-17136-41932875055321/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882610.1068478-17136-41932875055321/AnsiballZ_ping.py" <<< 15621 1726882610.18006: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882610.18077: stderr chunk (state=3): >>><<< 15621 1726882610.18104: stdout chunk (state=3): >>><<< 15621 1726882610.18206: done transferring module to remote 15621 1726882610.18210: _low_level_execute_command(): starting 15621 1726882610.18213: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882610.1068478-17136-41932875055321/ /root/.ansible/tmp/ansible-tmp-1726882610.1068478-17136-41932875055321/AnsiballZ_ping.py && sleep 0' 15621 1726882610.18843: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882610.18860: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882610.18880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882610.18900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882610.18981: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882610.19017: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882610.19039: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882610.19055: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 15621 1726882610.19170: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882610.21057: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882610.21170: stdout chunk (state=3): >>><<< 15621 1726882610.21173: stderr chunk (state=3): >>><<< 15621 1726882610.21176: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882610.21179: _low_level_execute_command(): starting 15621 1726882610.21182: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882610.1068478-17136-41932875055321/AnsiballZ_ping.py && sleep 0' 15621 1726882610.21746: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882610.21759: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882610.21776: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882610.21800: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882610.21818: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 15621 1726882610.21916: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882610.21947: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882610.22065: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882610.38288: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 15621 
1726882610.39718: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. <<< 15621 1726882610.39837: stdout chunk (state=3): >>><<< 15621 1726882610.39840: stderr chunk (state=3): >>><<< 15621 1726882610.39843: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 15621 1726882610.39876: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882610.1068478-17136-41932875055321/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15621 1726882610.40031: _low_level_execute_command(): starting 15621 1726882610.40035: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882610.1068478-17136-41932875055321/ > /dev/null 2>&1 && sleep 0' 15621 1726882610.40709: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882610.40719: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882610.40727: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found <<< 15621 1726882610.40811: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882610.40849: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882610.40853: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882610.40888: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882610.40964: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882610.44087: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882610.44184: stderr chunk (state=3): >>><<< 15621 1726882610.44198: stdout chunk (state=3): >>><<< 15621 1726882610.44429: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882610.44438: handler run complete 15621 1726882610.44441: attempt loop complete, returning result 15621 1726882610.44443: _execute() done 15621 1726882610.44445: dumping result to json 15621 1726882610.44447: done dumping result, returning 15621 1726882610.44450: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affc7ec-ae25-af1a-5b92-000000000051] 15621 1726882610.44452: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000051 15621 1726882610.44530: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000051 15621 1726882610.44534: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 15621 1726882610.44611: no more pending results, returning what we have 15621 1726882610.44615: results queue empty 15621 1726882610.44617: checking for any_errors_fatal 15621 1726882610.44627: done checking for any_errors_fatal 15621 1726882610.44628: checking for max_fail_percentage 15621 1726882610.44629: done checking for max_fail_percentage 15621 1726882610.44630: checking to see if all hosts have failed and the running result is not ok 15621 1726882610.44632: done checking to see if all hosts have failed 15621 1726882610.44633: getting the remaining hosts for this loop 15621 1726882610.44634: done getting the remaining hosts for this loop 15621 1726882610.44639: getting the next task for host managed_node3 15621 1726882610.44647: done getting next task for host 
managed_node3 15621 1726882610.44650: ^ task is: TASK: meta (role_complete) 15621 1726882610.44652: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15621 1726882610.44664: getting variables 15621 1726882610.44666: in VariableManager get_vars() 15621 1726882610.44711: Calling all_inventory to load vars for managed_node3 15621 1726882610.44715: Calling groups_inventory to load vars for managed_node3 15621 1726882610.44717: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882610.44853: Calling all_plugins_play to load vars for managed_node3 15621 1726882610.44858: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882610.44862: Calling groups_plugins_play to load vars for managed_node3 15621 1726882610.47095: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882610.49493: done with get_vars() 15621 1726882610.49533: done getting variables 15621 1726882610.49631: done queuing things up, now waiting for results queue to drain 15621 1726882610.49633: results queue empty 15621 1726882610.49634: checking for any_errors_fatal 15621 1726882610.49638: done checking for any_errors_fatal 15621 1726882610.49639: checking for max_fail_percentage 15621 1726882610.49640: done checking for max_fail_percentage 15621 1726882610.49641: checking to see if all hosts have failed and the running result is not ok 15621 1726882610.49642: done checking to see if all hosts have failed 15621 1726882610.49643: getting the remaining hosts for this loop 15621 1726882610.49644: done getting the remaining hosts for this loop 15621 1726882610.49647: getting the next task for host managed_node3 15621 1726882610.49651: done getting next task for host managed_node3 15621 1726882610.49653: ^ task is: TASK: meta (flush_handlers) 15621 1726882610.49654: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882610.49657: getting variables 15621 1726882610.49659: in VariableManager get_vars() 15621 1726882610.49673: Calling all_inventory to load vars for managed_node3 15621 1726882610.49675: Calling groups_inventory to load vars for managed_node3 15621 1726882610.49677: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882610.49683: Calling all_plugins_play to load vars for managed_node3 15621 1726882610.49686: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882610.49689: Calling groups_plugins_play to load vars for managed_node3 15621 1726882610.51288: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882610.53473: done with get_vars() 15621 1726882610.53509: done getting variables 15621 1726882610.53571: in VariableManager get_vars() 15621 1726882610.53592: Calling all_inventory to load vars for managed_node3 15621 1726882610.53595: Calling groups_inventory to load vars for managed_node3 15621 1726882610.53597: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882610.53603: Calling all_plugins_play to load vars for managed_node3 15621 1726882610.53606: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882610.53609: Calling groups_plugins_play to load vars for managed_node3 15621 1726882610.55096: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882610.57309: done with get_vars() 15621 1726882610.57346: done queuing things up, now waiting for results queue to drain 15621 1726882610.57349: results queue empty 15621 1726882610.57349: checking for any_errors_fatal 15621 1726882610.57351: done checking for any_errors_fatal 15621 1726882610.57352: checking for max_fail_percentage 15621 1726882610.57353: done checking for max_fail_percentage 15621 1726882610.57354: checking to see if all hosts have failed and the running result is not ok 15621 1726882610.57355: done checking to see if all hosts have failed 15621 1726882610.57356: getting the remaining hosts for this loop 15621 1726882610.57357: done getting the remaining hosts for this loop 15621 1726882610.57360: getting the next task for host managed_node3 15621 1726882610.57364: done getting next task for host managed_node3 15621 1726882610.57366: ^ task is: TASK: meta (flush_handlers) 15621 1726882610.57368: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882610.57371: getting variables 15621 1726882610.57372: in VariableManager get_vars() 15621 1726882610.57391: Calling all_inventory to load vars for managed_node3 15621 1726882610.57394: Calling groups_inventory to load vars for managed_node3 15621 1726882610.57397: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882610.57403: Calling all_plugins_play to load vars for managed_node3 15621 1726882610.57406: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882610.57409: Calling groups_plugins_play to load vars for managed_node3 15621 1726882610.58905: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882610.61028: done with get_vars() 15621 1726882610.61057: done getting variables 15621 1726882610.61112: in VariableManager get_vars() 15621 1726882610.61128: Calling all_inventory to load vars for managed_node3 15621 1726882610.61131: Calling groups_inventory to load vars for managed_node3 15621 1726882610.61133: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882610.61138: Calling all_plugins_play to load vars for managed_node3 15621 1726882610.61141: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882610.61144: Calling groups_plugins_play to load vars for managed_node3 15621 1726882610.62674: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882610.64875: done with get_vars() 15621 1726882610.64912: done queuing things up, now waiting for results queue to drain 15621 1726882610.64915: results queue empty 15621 1726882610.64916: checking for any_errors_fatal 15621 1726882610.64917: done checking for any_errors_fatal 15621 1726882610.64918: checking for max_fail_percentage 15621 1726882610.64919: done checking for max_fail_percentage 15621 1726882610.64920: checking to see if all hosts have failed and the running result is not ok 15621 1726882610.64921: done checking to see if all hosts have failed 15621 1726882610.64923: getting the remaining hosts for this loop 15621 1726882610.64925: done getting the remaining hosts for this loop 15621 1726882610.64928: getting the next task for host managed_node3 15621 1726882610.64931: done getting next task for host managed_node3 15621 1726882610.64932: ^ task is: None 15621 1726882610.64934: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882610.64935: done queuing things up, now waiting for results queue to drain 15621 1726882610.64936: results queue empty 15621 1726882610.64937: checking for any_errors_fatal 15621 1726882610.64937: done checking for any_errors_fatal 15621 1726882610.64938: checking for max_fail_percentage 15621 1726882610.64939: done checking for max_fail_percentage 15621 1726882610.64939: checking to see if all hosts have failed and the running result is not ok 15621 1726882610.64940: done checking to see if all hosts have failed 15621 1726882610.64941: getting the next task for host managed_node3 15621 1726882610.64944: done getting next task for host managed_node3 15621 1726882610.64944: ^ task is: None 15621 1726882610.64945: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15621 1726882610.64985: in VariableManager get_vars() 15621 1726882610.65002: done with get_vars() 15621 1726882610.65015: in VariableManager get_vars() 15621 1726882610.65030: done with get_vars() 15621 1726882610.65035: variable 'omit' from source: magic vars 15621 1726882610.65067: in VariableManager get_vars() 15621 1726882610.65079: done with get_vars() 15621 1726882610.65102: variable 'omit' from source: magic vars PLAY [Delete the interface] **************************************************** 15621 1726882610.65382: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15621 1726882610.65406: getting the remaining hosts for this loop 15621 1726882610.65408: done getting the remaining hosts for this loop 15621 1726882610.65410: getting the next task for host managed_node3 15621 1726882610.65413: done getting next task for host managed_node3 15621 1726882610.65416: ^ task is: TASK: Gathering Facts 15621 1726882610.65417: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882610.65419: getting variables 15621 1726882610.65420: in VariableManager get_vars() 15621 1726882610.65431: Calling all_inventory to load vars for managed_node3 15621 1726882610.65433: Calling groups_inventory to load vars for managed_node3 15621 1726882610.65436: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882610.65441: Calling all_plugins_play to load vars for managed_node3 15621 1726882610.65448: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882610.65451: Calling groups_plugins_play to load vars for managed_node3 15621 1726882610.67362: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882610.70047: done with get_vars() 15621 1726882610.70095: done getting variables 15621 1726882610.70279: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5 Friday 20 September 2024 21:36:50 -0400 (0:00:00.647) 0:00:42.782 ****** 15621 1726882610.70308: entering _queue_task() for managed_node3/gather_facts 15621 1726882610.70801: worker is 1 (out of 1 available) 15621 1726882610.70812: exiting _queue_task() for managed_node3/gather_facts 15621 1726882610.70826: done queuing things up, now waiting for results queue to drain 15621 1726882610.70828: waiting for pending results... 
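The 'Re-test connectivity' task above shows the full remote-execution cycle ansible-core runs for an ordinary module: create a temporary directory under ~/.ansible/tmp on the target, sftp the AnsiballZ_ping.py wrapper into it, chmod u+x the directory and script, execute it with /usr/bin/python3.12 (returning {"ping": "pong"}), then remove the temporary directory. With the role finished (meta: role_complete) and handlers flushed, a new play, PLAY [Delete the interface], starts with its implicit 'Gathering Facts' task from down_profile+delete_interface.yml:5. A sketch of how that play plausibly opens — only the play name, the host, and the fact-gathering step come from the trace; the placeholder task is a hypothetical illustration:

    # down_profile+delete_interface.yml (sketch, not the actual playbook)
    - name: Delete the interface
      hosts: managed_node3
      gather_facts: true                        # produces the 'Gathering Facts' task in the trace
      tasks:
        - name: Remove the test interface       # hypothetical placeholder task
          ansible.builtin.command: ip link delete lsr27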
15621 1726882610.71240: running TaskExecutor() for managed_node3/TASK: Gathering Facts 15621 1726882610.71366: in run() - task 0affc7ec-ae25-af1a-5b92-0000000003f8 15621 1726882610.71370: variable 'ansible_search_path' from source: unknown 15621 1726882610.71378: calling self._execute() 15621 1726882610.71492: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882610.71507: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882610.71523: variable 'omit' from source: magic vars 15621 1726882610.71979: variable 'ansible_distribution_major_version' from source: facts 15621 1726882610.71997: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882610.72013: variable 'omit' from source: magic vars 15621 1726882610.72057: variable 'omit' from source: magic vars 15621 1726882610.72119: variable 'omit' from source: magic vars 15621 1726882610.72155: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882610.72201: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882610.72327: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882610.72331: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882610.72336: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882610.72339: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882610.72343: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882610.72347: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882610.72454: Set connection var ansible_connection to ssh 15621 1726882610.72477: Set connection var ansible_shell_executable to /bin/sh 15621 1726882610.72489: Set connection var ansible_timeout to 10 15621 1726882610.72581: Set connection var ansible_shell_type to sh 15621 1726882610.72585: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882610.72588: Set connection var ansible_pipelining to False 15621 1726882610.72590: variable 'ansible_shell_executable' from source: unknown 15621 1726882610.72592: variable 'ansible_connection' from source: unknown 15621 1726882610.72594: variable 'ansible_module_compression' from source: unknown 15621 1726882610.72597: variable 'ansible_shell_type' from source: unknown 15621 1726882610.72599: variable 'ansible_shell_executable' from source: unknown 15621 1726882610.72601: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882610.72602: variable 'ansible_pipelining' from source: unknown 15621 1726882610.72605: variable 'ansible_timeout' from source: unknown 15621 1726882610.72607: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882610.72838: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15621 1726882610.72856: variable 'omit' from source: magic vars 15621 1726882610.72873: starting attempt loop 15621 1726882610.72881: running the 
handler 15621 1726882610.72911: variable 'ansible_facts' from source: unknown 15621 1726882610.72937: _low_level_execute_command(): starting 15621 1726882610.72959: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15621 1726882610.74296: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882610.74311: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882610.74329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882610.74354: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882610.74443: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882610.74505: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882610.74520: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882610.74581: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882610.74705: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882610.76701: stdout chunk (state=3): >>>/root <<< 15621 1726882610.76705: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882610.76707: stdout chunk (state=3): >>><<< 15621 1726882610.76710: stderr chunk (state=3): >>><<< 15621 1726882610.76831: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882610.76835: _low_level_execute_command(): starting 15621 1726882610.76839: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` 
echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882610.7673478-17166-191914773566946 `" && echo ansible-tmp-1726882610.7673478-17166-191914773566946="` echo /root/.ansible/tmp/ansible-tmp-1726882610.7673478-17166-191914773566946 `" ) && sleep 0' 15621 1726882610.77932: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882610.78008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882610.78048: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882610.78167: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882610.80639: stdout chunk (state=3): >>>ansible-tmp-1726882610.7673478-17166-191914773566946=/root/.ansible/tmp/ansible-tmp-1726882610.7673478-17166-191914773566946 <<< 15621 1726882610.80642: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882610.80645: stdout chunk (state=3): >>><<< 15621 1726882610.80647: stderr chunk (state=3): >>><<< 15621 1726882610.80650: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882610.7673478-17166-191914773566946=/root/.ansible/tmp/ansible-tmp-1726882610.7673478-17166-191914773566946 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882610.80653: variable 'ansible_module_compression' from source: unknown 15621 1726882610.80656: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-15621x2kdpbzd/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15621 1726882610.81030: variable 'ansible_facts' from source: unknown 15621 1726882610.81270: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882610.7673478-17166-191914773566946/AnsiballZ_setup.py 15621 1726882610.81489: Sending initial data 15621 1726882610.81499: Sent initial data (154 bytes) 15621 1726882610.82255: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882610.82268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882610.82308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882610.82345: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882610.82382: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882610.82474: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882610.84118: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15621 1726882610.84226: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15621 1726882610.84491: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmpvyny7_3y /root/.ansible/tmp/ansible-tmp-1726882610.7673478-17166-191914773566946/AnsiballZ_setup.py <<< 15621 1726882610.84495: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882610.7673478-17166-191914773566946/AnsiballZ_setup.py" <<< 15621 1726882610.84581: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmpvyny7_3y" to remote "/root/.ansible/tmp/ansible-tmp-1726882610.7673478-17166-191914773566946/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882610.7673478-17166-191914773566946/AnsiballZ_setup.py" <<< 15621 1726882610.86769: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882610.86855: stderr chunk (state=3): >>><<< 15621 1726882610.86863: stdout chunk (state=3): >>><<< 15621 1726882610.86898: done transferring module to remote 15621 1726882610.86916: _low_level_execute_command(): starting 15621 1726882610.86929: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882610.7673478-17166-191914773566946/ /root/.ansible/tmp/ansible-tmp-1726882610.7673478-17166-191914773566946/AnsiballZ_setup.py && sleep 0' 15621 1726882610.88164: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882610.88168: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882610.88171: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882610.88177: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882610.88195: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 15621 1726882610.88209: stderr chunk (state=3): >>>debug2: match not found <<< 15621 1726882610.88227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882610.88245: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15621 1726882610.88543: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882610.88546: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882610.88731: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882610.90939: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882610.90966: stderr chunk (state=3): >>><<< 15621 1726882610.90983: stdout chunk (state=3): >>><<< 15621 1726882610.91001: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882610.91011: _low_level_execute_command(): starting 15621 1726882610.91019: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882610.7673478-17166-191914773566946/AnsiballZ_setup.py && sleep 0' 15621 1726882610.91606: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882610.91620: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882610.91638: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882610.91658: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882610.91678: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 15621 1726882610.91776: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882610.91857: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882610.91951: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882613.02921: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_iscsi_iqn": "", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCqxUjCEqNblrTyW6Uf6mIYxca8N+p8oJNuOoXU65bGRNg3CMa5WjaOdqcLJoa5cqHU94Eb2GKTTyez0hcVUk7tsi3NxQudVrBDJQwbGPLKwHTfAOeffQrSKU6cQIc1wl+jLeNyQet7t+mRPHDLLjdsLuWud7KDSFY7tB05hqCIT7br7Ql/dFhcnCdWQFQMOHFOz3ScJe9gey/LD3ji7GRONjSr/t5cpKmB6mxzEmsb1n6YZdbP8HCphGcvKR4W+uaX3gVfQE0qvrqlobTyex8yIrkML2bRGO0cQ0YQWRYUwl+2NZufO8pixR1WlzvjooEQLCa77cJ6SZ8LyFkDOI+mMyuj8kcM9hS4AD91rPxl8C0d6Jg8RKqnImxC3X/NNaRYHqlewUGo6VKkcO4+lxgJGqYFcmkGEHzq4fuf6gtrr3rJkcIFcrluI0mSyZ2wXzI9K1OLHK0fnDvDUdV21RdTxfpz2ZFqykIWxdtugE4qaNMgbtV0VnufdkfZoCt9ayU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCVkskQ7Wf194qJaR5aLJzIbxDeKLsVL0wQFKV8r0F7GGZAGvI7/LHajoQ1NvRR35h4P+UpQQWPriVBtLfXYfXQ=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHICQbhAvKstSrwCX3R+nlPjOjLF0EHt/gL32n1ZS9Xl", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "ip-10-31-45-226.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-226", "ansible_nodename": "ip-10-31-45-226.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ea14692d25e88f0b7167787b368d", "ansible_is_chroot": false, "ansible_fips": false, "ansible_loadavg": {"1m": 0.65625, "5m": 0.634765625, "15m": 0.328125}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3106, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 610, "free": 3106}, "nocache": {"free": 3489, "used": 227}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", 
"ansible_product_serial": "ec22ea14-692d-25e8-8f0b-7167787b368d", "ansible_product_uuid": "ec22ea14-692d-25e8-8f0b-7167787b368d", "ansible_product_version": "4.11.amazon", "ansib<<< 15621 1726882613.02972: stdout chunk (state=3): >>>le_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 757, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251384217600, "block_size": 4096, "block_total": 64483404, "block_available": 61373100, "block_used": 3110304, "inode_total": 16384000, "inode_available": 16303143, "inode_used": 80857, "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"}], "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_lsb": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", "second": "52", "epoch": "1726882612", "epoch_int": "1726882612", "date": "2024-09-20", "time": "21:36:52", "iso8601_micro": "2024-09-21T01:36:52.985006Z", "iso8601": "2024-09-21T01:36:52Z", "iso8601_basic": "20240920T213652985006", "iso8601_basic_short": "20240920T213652", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_service_mgr": "systemd", "ansible_interfaces": ["lo", "peerlsr27", "lsr27", "eth0"], "ansible_lsr27": {"device": "lsr27", "macaddress": "b2:cc:5d:76:bc:9a", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::b0cc:5dff:fe76:bc9a", 
"prefix": "64", "scope": "link"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "02:19:da:ea:a3:f3", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.226", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::19:daff:feea:a3f3", "prefix": "64", "scope": "link"}]}, "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "8a:9f:5f:c3:90:d4", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::889f:5fff:fec3:90d4", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.226", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:19:da:ea:a3:f3", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.226"], "ansible_all_ipv6_addresses": ["fe80::b0cc:5dff:fe76:bc9a", "fe80::19:daff:feea:a3f3", "fe80::889f:5fff:fec3:90d4"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.226", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::19:daff:feea:a3f3", "fe80::889f:5fff:fec3:90d4", "fe80::b0cc:5dff:fe76:bc9a"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:b5954bb9-e972-4b2a-94f1-a82c77e96f77", "ansible_local": {}, "ansible_fibre_channel_wwn": [], "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.180 36814 10.31.45.226 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 36814 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15621 1726882613.05138: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 
<<< 15621 1726882613.05167: stdout chunk (state=3): >>><<< 15621 1726882613.05170: stderr chunk (state=3): >>><<< 15621 1726882613.05210: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_iscsi_iqn": "", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCqxUjCEqNblrTyW6Uf6mIYxca8N+p8oJNuOoXU65bGRNg3CMa5WjaOdqcLJoa5cqHU94Eb2GKTTyez0hcVUk7tsi3NxQudVrBDJQwbGPLKwHTfAOeffQrSKU6cQIc1wl+jLeNyQet7t+mRPHDLLjdsLuWud7KDSFY7tB05hqCIT7br7Ql/dFhcnCdWQFQMOHFOz3ScJe9gey/LD3ji7GRONjSr/t5cpKmB6mxzEmsb1n6YZdbP8HCphGcvKR4W+uaX3gVfQE0qvrqlobTyex8yIrkML2bRGO0cQ0YQWRYUwl+2NZufO8pixR1WlzvjooEQLCa77cJ6SZ8LyFkDOI+mMyuj8kcM9hS4AD91rPxl8C0d6Jg8RKqnImxC3X/NNaRYHqlewUGo6VKkcO4+lxgJGqYFcmkGEHzq4fuf6gtrr3rJkcIFcrluI0mSyZ2wXzI9K1OLHK0fnDvDUdV21RdTxfpz2ZFqykIWxdtugE4qaNMgbtV0VnufdkfZoCt9ayU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCVkskQ7Wf194qJaR5aLJzIbxDeKLsVL0wQFKV8r0F7GGZAGvI7/LHajoQ1NvRR35h4P+UpQQWPriVBtLfXYfXQ=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHICQbhAvKstSrwCX3R+nlPjOjLF0EHt/gL32n1ZS9Xl", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "ip-10-31-45-226.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-226", "ansible_nodename": "ip-10-31-45-226.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ea14692d25e88f0b7167787b368d", "ansible_is_chroot": false, "ansible_fips": false, "ansible_loadavg": {"1m": 0.65625, "5m": 0.634765625, "15m": 0.328125}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3106, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 610, "free": 3106}, "nocache": 
{"free": 3489, "used": 227}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ea14-692d-25e8-8f0b-7167787b368d", "ansible_product_uuid": "ec22ea14-692d-25e8-8f0b-7167787b368d", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 757, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251384217600, "block_size": 4096, "block_total": 64483404, "block_available": 61373100, "block_used": 3110304, "inode_total": 16384000, "inode_available": 16303143, "inode_used": 80857, "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"}], "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_lsb": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", "second": "52", "epoch": "1726882612", "epoch_int": "1726882612", "date": "2024-09-20", "time": "21:36:52", "iso8601_micro": 
"2024-09-21T01:36:52.985006Z", "iso8601": "2024-09-21T01:36:52Z", "iso8601_basic": "20240920T213652985006", "iso8601_basic_short": "20240920T213652", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_service_mgr": "systemd", "ansible_interfaces": ["lo", "peerlsr27", "lsr27", "eth0"], "ansible_lsr27": {"device": "lsr27", "macaddress": "b2:cc:5d:76:bc:9a", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::b0cc:5dff:fe76:bc9a", "prefix": "64", "scope": "link"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "02:19:da:ea:a3:f3", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.226", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::19:daff:feea:a3f3", "prefix": "64", "scope": "link"}]}, "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "8a:9f:5f:c3:90:d4", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::889f:5fff:fec3:90d4", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.226", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:19:da:ea:a3:f3", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.226"], "ansible_all_ipv6_addresses": ["fe80::b0cc:5dff:fe76:bc9a", "fe80::19:daff:feea:a3f3", "fe80::889f:5fff:fec3:90d4"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.226", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::19:daff:feea:a3f3", "fe80::889f:5fff:fec3:90d4", "fe80::b0cc:5dff:fe76:bc9a"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:b5954bb9-e972-4b2a-94f1-a82c77e96f77", "ansible_local": {}, "ansible_fibre_channel_wwn": [], "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.180 36814 10.31.45.226 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 36814 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 
2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 15621 1726882613.05637: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882610.7673478-17166-191914773566946/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15621 1726882613.05723: _low_level_execute_command(): starting 15621 1726882613.05727: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882610.7673478-17166-191914773566946/ > /dev/null 2>&1 && sleep 0' 15621 1726882613.06359: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882613.06399: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found <<< 15621 1726882613.06414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882613.06510: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882613.06538: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882613.06554: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 
1726882613.06583: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882613.06707: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882613.08640: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882613.08736: stderr chunk (state=3): >>><<< 15621 1726882613.08751: stdout chunk (state=3): >>><<< 15621 1726882613.08927: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882613.08931: handler run complete 15621 1726882613.08940: variable 'ansible_facts' from source: unknown 15621 1726882613.09060: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882613.09430: variable 'ansible_facts' from source: unknown 15621 1726882613.09536: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882613.09702: attempt loop complete, returning result 15621 1726882613.09714: _execute() done 15621 1726882613.09723: dumping result to json 15621 1726882613.09757: done dumping result, returning 15621 1726882613.09770: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [0affc7ec-ae25-af1a-5b92-0000000003f8] 15621 1726882613.09785: sending task result for task 0affc7ec-ae25-af1a-5b92-0000000003f8 ok: [managed_node3] 15621 1726882613.10756: no more pending results, returning what we have 15621 1726882613.10759: results queue empty 15621 1726882613.10760: checking for any_errors_fatal 15621 1726882613.10762: done checking for any_errors_fatal 15621 1726882613.10825: checking for max_fail_percentage 15621 1726882613.10829: done checking for max_fail_percentage 15621 1726882613.10830: checking to see if all hosts have failed and the running result is not ok 15621 1726882613.10831: done checking to see if all hosts have failed 15621 1726882613.10832: getting the remaining hosts for this loop 15621 1726882613.10833: done getting the remaining hosts for this loop 15621 1726882613.10837: getting the next task for host managed_node3 15621 1726882613.10843: done getting next task for host managed_node3 15621 1726882613.10844: ^ task is: TASK: meta (flush_handlers) 15621 1726882613.10846: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15621 1726882613.10851: getting variables 15621 1726882613.10852: in VariableManager get_vars() 15621 1726882613.10936: Calling all_inventory to load vars for managed_node3 15621 1726882613.10939: Calling groups_inventory to load vars for managed_node3 15621 1726882613.10943: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882613.10950: done sending task result for task 0affc7ec-ae25-af1a-5b92-0000000003f8 15621 1726882613.10953: WORKER PROCESS EXITING 15621 1726882613.10964: Calling all_plugins_play to load vars for managed_node3 15621 1726882613.10968: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882613.10971: Calling groups_plugins_play to load vars for managed_node3 15621 1726882613.12740: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882613.14919: done with get_vars() 15621 1726882613.14956: done getting variables 15621 1726882613.15035: in VariableManager get_vars() 15621 1726882613.15047: Calling all_inventory to load vars for managed_node3 15621 1726882613.15049: Calling groups_inventory to load vars for managed_node3 15621 1726882613.15052: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882613.15057: Calling all_plugins_play to load vars for managed_node3 15621 1726882613.15059: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882613.15062: Calling groups_plugins_play to load vars for managed_node3 15621 1726882613.16908: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882613.19258: done with get_vars() 15621 1726882613.19301: done queuing things up, now waiting for results queue to drain 15621 1726882613.19303: results queue empty 15621 1726882613.19304: checking for any_errors_fatal 15621 1726882613.19309: done checking for any_errors_fatal 15621 1726882613.19310: checking for max_fail_percentage 15621 1726882613.19316: done checking for max_fail_percentage 15621 1726882613.19317: checking to see if all hosts have failed and the running result is not ok 15621 1726882613.19318: done checking to see if all hosts have failed 15621 1726882613.19319: getting the remaining hosts for this loop 15621 1726882613.19320: done getting the remaining hosts for this loop 15621 1726882613.19326: getting the next task for host managed_node3 15621 1726882613.19331: done getting next task for host managed_node3 15621 1726882613.19334: ^ task is: TASK: Include the task 'delete_interface.yml' 15621 1726882613.19336: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882613.19339: getting variables 15621 1726882613.19340: in VariableManager get_vars() 15621 1726882613.19351: Calling all_inventory to load vars for managed_node3 15621 1726882613.19353: Calling groups_inventory to load vars for managed_node3 15621 1726882613.19355: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882613.19362: Calling all_plugins_play to load vars for managed_node3 15621 1726882613.19365: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882613.19369: Calling groups_plugins_play to load vars for managed_node3 15621 1726882613.20894: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882613.23193: done with get_vars() 15621 1726882613.23221: done getting variables TASK [Include the task 'delete_interface.yml'] ********************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:8 Friday 20 September 2024 21:36:53 -0400 (0:00:02.529) 0:00:45.312 ****** 15621 1726882613.23308: entering _queue_task() for managed_node3/include_tasks 15621 1726882613.23965: worker is 1 (out of 1 available) 15621 1726882613.23978: exiting _queue_task() for managed_node3/include_tasks 15621 1726882613.23991: done queuing things up, now waiting for results queue to drain 15621 1726882613.23993: waiting for pending results... 15621 1726882613.24197: running TaskExecutor() for managed_node3/TASK: Include the task 'delete_interface.yml' 15621 1726882613.24339: in run() - task 0affc7ec-ae25-af1a-5b92-000000000054 15621 1726882613.24363: variable 'ansible_search_path' from source: unknown 15621 1726882613.24410: calling self._execute() 15621 1726882613.24547: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882613.24550: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882613.24553: variable 'omit' from source: magic vars 15621 1726882613.24948: variable 'ansible_distribution_major_version' from source: facts 15621 1726882613.24966: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882613.24982: _execute() done 15621 1726882613.24991: dumping result to json 15621 1726882613.25000: done dumping result, returning 15621 1726882613.25127: done running TaskExecutor() for managed_node3/TASK: Include the task 'delete_interface.yml' [0affc7ec-ae25-af1a-5b92-000000000054] 15621 1726882613.25131: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000054 15621 1726882613.25214: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000054 15621 1726882613.25217: WORKER PROCESS EXITING 15621 1726882613.25253: no more pending results, returning what we have 15621 1726882613.25258: in VariableManager get_vars() 15621 1726882613.25297: Calling all_inventory to load vars for managed_node3 15621 1726882613.25300: Calling groups_inventory to load vars for managed_node3 15621 1726882613.25305: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882613.25325: Calling all_plugins_play to load vars for managed_node3 15621 1726882613.25329: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882613.25333: Calling groups_plugins_play to load vars for managed_node3 15621 1726882613.27242: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882613.29318: done with get_vars() 15621 
1726882613.29347: variable 'ansible_search_path' from source: unknown 15621 1726882613.29364: we have included files to process 15621 1726882613.29365: generating all_blocks data 15621 1726882613.29366: done generating all_blocks data 15621 1726882613.29367: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 15621 1726882613.29368: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 15621 1726882613.29370: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 15621 1726882613.29615: done processing included file 15621 1726882613.29618: iterating over new_blocks loaded from include file 15621 1726882613.29619: in VariableManager get_vars() 15621 1726882613.29636: done with get_vars() 15621 1726882613.29638: filtering new block on tags 15621 1726882613.29654: done filtering new block on tags 15621 1726882613.29656: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed_node3 15621 1726882613.29662: extending task lists for all hosts with included blocks 15621 1726882613.29696: done extending task lists 15621 1726882613.29697: done processing included files 15621 1726882613.29698: results queue empty 15621 1726882613.29699: checking for any_errors_fatal 15621 1726882613.29701: done checking for any_errors_fatal 15621 1726882613.29702: checking for max_fail_percentage 15621 1726882613.29703: done checking for max_fail_percentage 15621 1726882613.29704: checking to see if all hosts have failed and the running result is not ok 15621 1726882613.29705: done checking to see if all hosts have failed 15621 1726882613.29705: getting the remaining hosts for this loop 15621 1726882613.29707: done getting the remaining hosts for this loop 15621 1726882613.29709: getting the next task for host managed_node3 15621 1726882613.29713: done getting next task for host managed_node3 15621 1726882613.29715: ^ task is: TASK: Remove test interface if necessary 15621 1726882613.29717: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882613.29720: getting variables 15621 1726882613.29721: in VariableManager get_vars() 15621 1726882613.29733: Calling all_inventory to load vars for managed_node3 15621 1726882613.29736: Calling groups_inventory to load vars for managed_node3 15621 1726882613.29739: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882613.29745: Calling all_plugins_play to load vars for managed_node3 15621 1726882613.29747: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882613.29751: Calling groups_plugins_play to load vars for managed_node3 15621 1726882613.31379: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882613.33404: done with get_vars() 15621 1726882613.33438: done getting variables 15621 1726882613.33488: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Friday 20 September 2024 21:36:53 -0400 (0:00:00.102) 0:00:45.414 ****** 15621 1726882613.33524: entering _queue_task() for managed_node3/command 15621 1726882613.33902: worker is 1 (out of 1 available) 15621 1726882613.33916: exiting _queue_task() for managed_node3/command 15621 1726882613.33933: done queuing things up, now waiting for results queue to drain 15621 1726882613.33935: waiting for pending results... 15621 1726882613.34342: running TaskExecutor() for managed_node3/TASK: Remove test interface if necessary 15621 1726882613.34418: in run() - task 0affc7ec-ae25-af1a-5b92-000000000409 15621 1726882613.34442: variable 'ansible_search_path' from source: unknown 15621 1726882613.34484: variable 'ansible_search_path' from source: unknown 15621 1726882613.34502: calling self._execute() 15621 1726882613.34596: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882613.34612: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882613.34626: variable 'omit' from source: magic vars 15621 1726882613.35059: variable 'ansible_distribution_major_version' from source: facts 15621 1726882613.35129: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882613.35132: variable 'omit' from source: magic vars 15621 1726882613.35135: variable 'omit' from source: magic vars 15621 1726882613.35252: variable 'interface' from source: set_fact 15621 1726882613.35281: variable 'omit' from source: magic vars 15621 1726882613.35331: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882613.35389: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882613.35414: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882613.35442: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882613.35565: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 
1726882613.35569: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882613.35575: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882613.35578: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882613.35649: Set connection var ansible_connection to ssh 15621 1726882613.35665: Set connection var ansible_shell_executable to /bin/sh 15621 1726882613.35685: Set connection var ansible_timeout to 10 15621 1726882613.35698: Set connection var ansible_shell_type to sh 15621 1726882613.35782: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882613.35786: Set connection var ansible_pipelining to False 15621 1726882613.35788: variable 'ansible_shell_executable' from source: unknown 15621 1726882613.35791: variable 'ansible_connection' from source: unknown 15621 1726882613.35794: variable 'ansible_module_compression' from source: unknown 15621 1726882613.35801: variable 'ansible_shell_type' from source: unknown 15621 1726882613.35803: variable 'ansible_shell_executable' from source: unknown 15621 1726882613.35806: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882613.35808: variable 'ansible_pipelining' from source: unknown 15621 1726882613.35810: variable 'ansible_timeout' from source: unknown 15621 1726882613.35812: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882613.35977: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15621 1726882613.35999: variable 'omit' from source: magic vars 15621 1726882613.36011: starting attempt loop 15621 1726882613.36024: running the handler 15621 1726882613.36045: _low_level_execute_command(): starting 15621 1726882613.36058: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15621 1726882613.37137: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882613.37141: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882613.37144: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882613.37146: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882613.37215: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882613.37218: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882613.37327: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882613.39111: stdout chunk (state=3): >>>/root <<< 15621 1726882613.39314: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882613.39317: stdout chunk (state=3): >>><<< 15621 1726882613.39319: stderr chunk (state=3): >>><<< 15621 1726882613.39341: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882613.39359: _low_level_execute_command(): starting 15621 1726882613.39368: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882613.393471-17287-22471504016058 `" && echo ansible-tmp-1726882613.393471-17287-22471504016058="` echo /root/.ansible/tmp/ansible-tmp-1726882613.393471-17287-22471504016058 `" ) && sleep 0' 15621 1726882613.40009: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882613.40030: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882613.40046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882613.40097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882613.40113: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882613.40211: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882613.40254: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882613.40351: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 15621 1726882613.42324: stdout chunk (state=3): >>>ansible-tmp-1726882613.393471-17287-22471504016058=/root/.ansible/tmp/ansible-tmp-1726882613.393471-17287-22471504016058 <<< 15621 1726882613.42548: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882613.42551: stdout chunk (state=3): >>><<< 15621 1726882613.42554: stderr chunk (state=3): >>><<< 15621 1726882613.42729: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882613.393471-17287-22471504016058=/root/.ansible/tmp/ansible-tmp-1726882613.393471-17287-22471504016058 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882613.42732: variable 'ansible_module_compression' from source: unknown 15621 1726882613.42735: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15621x2kdpbzd/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15621 1726882613.42738: variable 'ansible_facts' from source: unknown 15621 1726882613.42815: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882613.393471-17287-22471504016058/AnsiballZ_command.py 15621 1726882613.42987: Sending initial data 15621 1726882613.43090: Sent initial data (154 bytes) 15621 1726882613.43742: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882613.43754: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882613.43852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882613.43883: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/9aa64530f0' <<< 15621 1726882613.43907: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882613.44025: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882613.45642: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15621 1726882613.45742: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15621 1726882613.45853: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmpu0engwze /root/.ansible/tmp/ansible-tmp-1726882613.393471-17287-22471504016058/AnsiballZ_command.py <<< 15621 1726882613.45871: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882613.393471-17287-22471504016058/AnsiballZ_command.py" <<< 15621 1726882613.45944: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmpu0engwze" to remote "/root/.ansible/tmp/ansible-tmp-1726882613.393471-17287-22471504016058/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882613.393471-17287-22471504016058/AnsiballZ_command.py" <<< 15621 1726882613.46950: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882613.47004: stderr chunk (state=3): >>><<< 15621 1726882613.47017: stdout chunk (state=3): >>><<< 15621 1726882613.47061: done transferring module to remote 15621 1726882613.47086: _low_level_execute_command(): starting 15621 1726882613.47096: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882613.393471-17287-22471504016058/ /root/.ansible/tmp/ansible-tmp-1726882613.393471-17287-22471504016058/AnsiballZ_command.py && sleep 0' 15621 1726882613.47745: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882613.47748: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882613.47841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 
originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882613.47878: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882613.47897: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882613.47918: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882613.48049: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882613.49996: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882613.50032: stdout chunk (state=3): >>><<< 15621 1726882613.50036: stderr chunk (state=3): >>><<< 15621 1726882613.50145: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882613.50148: _low_level_execute_command(): starting 15621 1726882613.50150: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882613.393471-17287-22471504016058/AnsiballZ_command.py && sleep 0' 15621 1726882613.50984: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882613.50987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882613.50990: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15621 1726882613.50997: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882613.51042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882613.51046: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882613.51062: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882613.51169: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882613.68460: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "lsr27"], "start": "2024-09-20 21:36:53.674120", "end": "2024-09-20 21:36:53.680787", "delta": "0:00:00.006667", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del lsr27", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15621 1726882613.71053: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. <<< 15621 1726882613.71057: stdout chunk (state=3): >>><<< 15621 1726882613.71060: stderr chunk (state=3): >>><<< 15621 1726882613.71086: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "lsr27"], "start": "2024-09-20 21:36:53.674120", "end": "2024-09-20 21:36:53.680787", "delta": "0:00:00.006667", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del lsr27", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 
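[editor's note] At this point the module has run and returned its result as a single JSON document on stdout (rc=0 for 'ip link del lsr27'). A small sketch of pulling out the fields that the task result below reports; the literal here is an abridged copy of the stdout shown above:

    import json

    # Abridged copy of the module stdout printed above; whitespace added for readability.
    module_stdout = '''{"changed": true, "stdout": "", "stderr": "", "rc": 0,
        "cmd": ["ip", "link", "del", "lsr27"],
        "start": "2024-09-20 21:36:53.674120", "end": "2024-09-20 21:36:53.680787",
        "delta": "0:00:00.006667", "msg": ""}'''

    result = json.loads(module_stdout)
    print(result["rc"], result["cmd"], result["delta"])
    # -> 0 ['ip', 'link', 'del', 'lsr27'] 0:00:00.006667
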
15621 1726882613.71235: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del lsr27', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882613.393471-17287-22471504016058/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15621 1726882613.71239: _low_level_execute_command(): starting 15621 1726882613.71242: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882613.393471-17287-22471504016058/ > /dev/null 2>&1 && sleep 0' 15621 1726882613.71894: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882613.71916: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882613.71936: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882613.72012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882613.72066: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882613.72090: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882613.72131: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882613.72262: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882613.74248: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882613.74329: stderr chunk (state=3): >>><<< 15621 1726882613.74345: stdout chunk (state=3): >>><<< 15621 1726882613.74527: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882613.74531: handler run complete 15621 1726882613.74534: Evaluated conditional (False): False 15621 1726882613.74536: attempt loop complete, returning result 15621 1726882613.74538: _execute() done 15621 1726882613.74540: dumping result to json 15621 1726882613.74542: done dumping result, returning 15621 1726882613.74544: done running TaskExecutor() for managed_node3/TASK: Remove test interface if necessary [0affc7ec-ae25-af1a-5b92-000000000409] 15621 1726882613.74546: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000409 15621 1726882613.74628: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000409 15621 1726882613.74632: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ip", "link", "del", "lsr27" ], "delta": "0:00:00.006667", "end": "2024-09-20 21:36:53.680787", "rc": 0, "start": "2024-09-20 21:36:53.674120" } 15621 1726882613.74706: no more pending results, returning what we have 15621 1726882613.74711: results queue empty 15621 1726882613.74712: checking for any_errors_fatal 15621 1726882613.74714: done checking for any_errors_fatal 15621 1726882613.74715: checking for max_fail_percentage 15621 1726882613.74716: done checking for max_fail_percentage 15621 1726882613.74717: checking to see if all hosts have failed and the running result is not ok 15621 1726882613.74719: done checking to see if all hosts have failed 15621 1726882613.74719: getting the remaining hosts for this loop 15621 1726882613.74721: done getting the remaining hosts for this loop 15621 1726882613.74728: getting the next task for host managed_node3 15621 1726882613.74738: done getting next task for host managed_node3 15621 1726882613.74741: ^ task is: TASK: meta (flush_handlers) 15621 1726882613.74743: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
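[editor's note] The task is now finished: the temporary directory has been removed and the result handed back to the controller. (The displayed result reports "changed": false even though the module payload above said "changed": true, which is consistent with the task overriding its changed status, e.g. via changed_when: false in delete_interface.yml; the task file itself is not shown in this log.) The full remote-execution cycle visible above, expressed as the shell commands that were issued; a standalone summary with shortened paths, not how the ssh connection plugin is actually driven:

    import time, os, random

    remote_tmp = "/root/.ansible/tmp"
    # The real temp-dir name mixes a timestamp, a pid-like number and a random
    # suffix; this only mirrors that pattern for illustration.
    tmpdir = f"{remote_tmp}/ansible-tmp-{time.time()}-{os.getpid()}-{random.randrange(2**48)}"
    payload = f"{tmpdir}/AnsiballZ_command.py"

    steps = [
        "/bin/sh -c 'echo ~ && sleep 0'",                                  # resolve remote home
        f"/bin/sh -c '( umask 77 && mkdir -p \"{tmpdir}\" ) && sleep 0'",  # private temp dir
        f"sftp put <local AnsiballZ_command.py> {payload}",                # module transfer
        f"/bin/sh -c 'chmod u+x {tmpdir}/ {payload} && sleep 0'",          # make it executable
        f"/bin/sh -c '/usr/bin/python3.12 {payload} && sleep 0'",          # run the module
        f"/bin/sh -c 'rm -f -r {tmpdir}/ > /dev/null 2>&1 && sleep 0'",    # clean up
    ]
    for step in steps:
        print(step)
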
False 15621 1726882613.74748: getting variables 15621 1726882613.74750: in VariableManager get_vars() 15621 1726882613.74783: Calling all_inventory to load vars for managed_node3 15621 1726882613.74786: Calling groups_inventory to load vars for managed_node3 15621 1726882613.74791: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882613.74806: Calling all_plugins_play to load vars for managed_node3 15621 1726882613.74810: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882613.74813: Calling groups_plugins_play to load vars for managed_node3 15621 1726882613.77109: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882613.79292: done with get_vars() 15621 1726882613.79328: done getting variables 15621 1726882613.79411: in VariableManager get_vars() 15621 1726882613.79423: Calling all_inventory to load vars for managed_node3 15621 1726882613.79427: Calling groups_inventory to load vars for managed_node3 15621 1726882613.79429: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882613.79435: Calling all_plugins_play to load vars for managed_node3 15621 1726882613.79437: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882613.79440: Calling groups_plugins_play to load vars for managed_node3 15621 1726882613.80898: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882613.83067: done with get_vars() 15621 1726882613.83114: done queuing things up, now waiting for results queue to drain 15621 1726882613.83117: results queue empty 15621 1726882613.83118: checking for any_errors_fatal 15621 1726882613.83124: done checking for any_errors_fatal 15621 1726882613.83125: checking for max_fail_percentage 15621 1726882613.83128: done checking for max_fail_percentage 15621 1726882613.83129: checking to see if all hosts have failed and the running result is not ok 15621 1726882613.83129: done checking to see if all hosts have failed 15621 1726882613.83130: getting the remaining hosts for this loop 15621 1726882613.83131: done getting the remaining hosts for this loop 15621 1726882613.83134: getting the next task for host managed_node3 15621 1726882613.83139: done getting next task for host managed_node3 15621 1726882613.83141: ^ task is: TASK: meta (flush_handlers) 15621 1726882613.83143: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
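[editor's note] Between tasks, VariableManager.get_vars() walks the sources listed above for managed_node3 (all_inventory, groups_inventory, all_plugins_inventory, all_plugins_play, groups_plugins_inventory, groups_plugins_play). A toy combine in the order the log lists them, with later layers overriding earlier ones; the example values are made up, and ansible-core's real precedence rules are considerably richer:

    # Hypothetical variable layers merged in the order shown in the log above.
    sources = [
        ("all_inventory",            {"ansible_user": "root"}),
        ("groups_inventory",         {}),
        ("all_plugins_inventory",    {}),
        ("all_plugins_play",         {"profile": "lsr27"}),    # assumed example value
        ("groups_plugins_inventory", {}),
        ("groups_plugins_play",      {}),
    ]

    merged = {}
    for _name, layer in sources:
        merged.update(layer)        # later layers win in this toy version
    print(merged)
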
False 15621 1726882613.83145: getting variables 15621 1726882613.83146: in VariableManager get_vars() 15621 1726882613.83158: Calling all_inventory to load vars for managed_node3 15621 1726882613.83161: Calling groups_inventory to load vars for managed_node3 15621 1726882613.83163: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882613.83169: Calling all_plugins_play to load vars for managed_node3 15621 1726882613.83172: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882613.83175: Calling groups_plugins_play to load vars for managed_node3 15621 1726882613.84729: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882613.86791: done with get_vars() 15621 1726882613.86836: done getting variables 15621 1726882613.86895: in VariableManager get_vars() 15621 1726882613.86907: Calling all_inventory to load vars for managed_node3 15621 1726882613.86910: Calling groups_inventory to load vars for managed_node3 15621 1726882613.86912: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882613.86918: Calling all_plugins_play to load vars for managed_node3 15621 1726882613.86921: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882613.86926: Calling groups_plugins_play to load vars for managed_node3 15621 1726882613.88578: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882613.95473: done with get_vars() 15621 1726882613.95508: done queuing things up, now waiting for results queue to drain 15621 1726882613.95515: results queue empty 15621 1726882613.95516: checking for any_errors_fatal 15621 1726882613.95517: done checking for any_errors_fatal 15621 1726882613.95518: checking for max_fail_percentage 15621 1726882613.95519: done checking for max_fail_percentage 15621 1726882613.95520: checking to see if all hosts have failed and the running result is not ok 15621 1726882613.95521: done checking to see if all hosts have failed 15621 1726882613.95524: getting the remaining hosts for this loop 15621 1726882613.95525: done getting the remaining hosts for this loop 15621 1726882613.95527: getting the next task for host managed_node3 15621 1726882613.95531: done getting next task for host managed_node3 15621 1726882613.95532: ^ task is: None 15621 1726882613.95533: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882613.95534: done queuing things up, now waiting for results queue to drain 15621 1726882613.95535: results queue empty 15621 1726882613.95536: checking for any_errors_fatal 15621 1726882613.95537: done checking for any_errors_fatal 15621 1726882613.95537: checking for max_fail_percentage 15621 1726882613.95538: done checking for max_fail_percentage 15621 1726882613.95539: checking to see if all hosts have failed and the running result is not ok 15621 1726882613.95540: done checking to see if all hosts have failed 15621 1726882613.95541: getting the next task for host managed_node3 15621 1726882613.95543: done getting next task for host managed_node3 15621 1726882613.95544: ^ task is: None 15621 1726882613.95545: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15621 1726882613.95579: in VariableManager get_vars() 15621 1726882613.95599: done with get_vars() 15621 1726882613.95604: in VariableManager get_vars() 15621 1726882613.95616: done with get_vars() 15621 1726882613.95620: variable 'omit' from source: magic vars 15621 1726882613.95732: variable 'profile' from source: play vars 15621 1726882613.95832: in VariableManager get_vars() 15621 1726882613.95847: done with get_vars() 15621 1726882613.95867: variable 'omit' from source: magic vars 15621 1726882613.95936: variable 'profile' from source: play vars PLAY [Remove {{ profile }}] **************************************************** 15621 1726882613.96729: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15621 1726882613.96757: getting the remaining hosts for this loop 15621 1726882613.96759: done getting the remaining hosts for this loop 15621 1726882613.96762: getting the next task for host managed_node3 15621 1726882613.96764: done getting next task for host managed_node3 15621 1726882613.96766: ^ task is: TASK: Gathering Facts 15621 1726882613.96768: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
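[editor's note] A new play begins here; its name is a template, "Remove {{ profile }}", with profile resolved from play vars just above (the banner prints the raw expression). A small sketch of rendering such a name with Jinja2; the value "lsr27" is only an assumption, since the resolved profile is not printed at this point in the log:

    import jinja2

    # 'profile' value is assumed for illustration; the log does not show it here.
    play_name = jinja2.Environment().from_string("Remove {{ profile }}").render(profile="lsr27")
    print(play_name)   # -> Remove lsr27
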
False 15621 1726882613.96770: getting variables 15621 1726882613.96771: in VariableManager get_vars() 15621 1726882613.96783: Calling all_inventory to load vars for managed_node3 15621 1726882613.96785: Calling groups_inventory to load vars for managed_node3 15621 1726882613.96788: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882613.96793: Calling all_plugins_play to load vars for managed_node3 15621 1726882613.96796: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882613.96799: Calling groups_plugins_play to load vars for managed_node3 15621 1726882613.98315: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882614.00471: done with get_vars() 15621 1726882614.00506: done getting variables 15621 1726882614.00567: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3 Friday 20 September 2024 21:36:54 -0400 (0:00:00.670) 0:00:46.085 ****** 15621 1726882614.00594: entering _queue_task() for managed_node3/gather_facts 15621 1726882614.00983: worker is 1 (out of 1 available) 15621 1726882614.01001: exiting _queue_task() for managed_node3/gather_facts 15621 1726882614.01014: done queuing things up, now waiting for results queue to drain 15621 1726882614.01016: waiting for pending results... 
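[editor's note] The Gathering Facts task banner above carries the usual profiling suffix: the parenthesised value is the time spent on the previous task and the final value is the cumulative runtime. A quick check of that arithmetic against the two banners in this section (small differences come from rounding of the displayed values):

    from datetime import timedelta

    previous_task = timedelta(seconds=0.670)    # "(0:00:00.670)" in the banner above
    total_before  = timedelta(seconds=45.414)   # cumulative time on the earlier banner
    print(total_before + previous_task)         # 0:00:46.084000, displayed as 0:00:46.085
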
15621 1726882614.01260: running TaskExecutor() for managed_node3/TASK: Gathering Facts 15621 1726882614.01434: in run() - task 0affc7ec-ae25-af1a-5b92-000000000417 15621 1726882614.01439: variable 'ansible_search_path' from source: unknown 15621 1726882614.01442: calling self._execute() 15621 1726882614.01536: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882614.01549: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882614.01565: variable 'omit' from source: magic vars 15621 1726882614.01993: variable 'ansible_distribution_major_version' from source: facts 15621 1726882614.02017: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882614.02034: variable 'omit' from source: magic vars 15621 1726882614.02068: variable 'omit' from source: magic vars 15621 1726882614.02229: variable 'omit' from source: magic vars 15621 1726882614.02235: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882614.02238: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882614.02250: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882614.02279: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882614.02297: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882614.02339: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882614.02350: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882614.02428: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882614.02497: Set connection var ansible_connection to ssh 15621 1726882614.02513: Set connection var ansible_shell_executable to /bin/sh 15621 1726882614.02528: Set connection var ansible_timeout to 10 15621 1726882614.02536: Set connection var ansible_shell_type to sh 15621 1726882614.02547: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882614.02557: Set connection var ansible_pipelining to False 15621 1726882614.02596: variable 'ansible_shell_executable' from source: unknown 15621 1726882614.02606: variable 'ansible_connection' from source: unknown 15621 1726882614.02614: variable 'ansible_module_compression' from source: unknown 15621 1726882614.02624: variable 'ansible_shell_type' from source: unknown 15621 1726882614.02633: variable 'ansible_shell_executable' from source: unknown 15621 1726882614.02641: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882614.02684: variable 'ansible_pipelining' from source: unknown 15621 1726882614.02687: variable 'ansible_timeout' from source: unknown 15621 1726882614.02690: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882614.02880: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15621 1726882614.02905: variable 'omit' from source: magic vars 15621 1726882614.02916: starting attempt loop 15621 1726882614.02926: running the 
handler 15621 1726882614.03010: variable 'ansible_facts' from source: unknown 15621 1726882614.03014: _low_level_execute_command(): starting 15621 1726882614.03017: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15621 1726882614.03966: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882614.03988: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882614.04042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882614.04119: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882614.04151: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882614.04275: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882614.06050: stdout chunk (state=3): >>>/root <<< 15621 1726882614.06265: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882614.06269: stdout chunk (state=3): >>><<< 15621 1726882614.06271: stderr chunk (state=3): >>><<< 15621 1726882614.06392: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882614.06396: _low_level_execute_command(): starting 15621 1726882614.06399: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882614.062931-17310-20294765529278 `" && echo 
ansible-tmp-1726882614.062931-17310-20294765529278="` echo /root/.ansible/tmp/ansible-tmp-1726882614.062931-17310-20294765529278 `" ) && sleep 0' 15621 1726882614.07004: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882614.07036: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882614.07051: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882614.07176: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882614.09153: stdout chunk (state=3): >>>ansible-tmp-1726882614.062931-17310-20294765529278=/root/.ansible/tmp/ansible-tmp-1726882614.062931-17310-20294765529278 <<< 15621 1726882614.09286: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882614.09386: stderr chunk (state=3): >>><<< 15621 1726882614.09399: stdout chunk (state=3): >>><<< 15621 1726882614.09629: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882614.062931-17310-20294765529278=/root/.ansible/tmp/ansible-tmp-1726882614.062931-17310-20294765529278 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882614.09633: variable 'ansible_module_compression' from source: unknown 15621 1726882614.09636: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15621x2kdpbzd/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15621 1726882614.09639: variable 'ansible_facts' from source: 
unknown 15621 1726882614.09828: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882614.062931-17310-20294765529278/AnsiballZ_setup.py 15621 1726882614.10110: Sending initial data 15621 1726882614.10114: Sent initial data (152 bytes) 15621 1726882614.10787: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882614.10809: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15621 1726882614.10861: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882614.10929: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882614.10946: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882614.10974: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882614.11092: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882614.12697: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15621 1726882614.12806: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15621 1726882614.12911: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmp7h4upusk /root/.ansible/tmp/ansible-tmp-1726882614.062931-17310-20294765529278/AnsiballZ_setup.py <<< 15621 1726882614.12915: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882614.062931-17310-20294765529278/AnsiballZ_setup.py" <<< 15621 1726882614.12988: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmp7h4upusk" to remote "/root/.ansible/tmp/ansible-tmp-1726882614.062931-17310-20294765529278/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882614.062931-17310-20294765529278/AnsiballZ_setup.py" <<< 15621 1726882614.14844: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882614.14896: stderr chunk (state=3): >>><<< 15621 1726882614.14907: stdout chunk (state=3): >>><<< 15621 1726882614.15042: done transferring module to remote 15621 1726882614.15045: _low_level_execute_command(): starting 15621 1726882614.15047: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882614.062931-17310-20294765529278/ /root/.ansible/tmp/ansible-tmp-1726882614.062931-17310-20294765529278/AnsiballZ_setup.py && sleep 0' 15621 1726882614.15629: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882614.15642: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882614.15656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882614.15675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882614.15787: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882614.15811: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882614.15930: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882614.17841: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882614.17852: stdout chunk (state=3): >>><<< 15621 1726882614.17869: stderr chunk (state=3): >>><<< 15621 1726882614.17929: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882614.17933: _low_level_execute_command(): starting 15621 1726882614.17938: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882614.062931-17310-20294765529278/AnsiballZ_setup.py && sleep 0' 15621 1726882614.18613: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882614.18633: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882614.18649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882614.18667: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882614.18694: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found <<< 15621 1726882614.18803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882614.18828: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882614.18948: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882616.26446: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCqxUjCEqNblrTyW6Uf6mIYxca8N+p8oJNuOoXU65bGRNg3CMa5WjaOdqcLJoa5cqHU94Eb2GKTTyez0hcVUk7tsi3NxQudVrBDJQwbGPLKwHTfAOeffQrSKU6cQIc1wl+jLeNyQet7t+mRPHDLLjdsLuWud7KDSFY7tB05hqCIT7br7Ql/dFhcnCdWQFQMOHFOz3ScJe9gey/LD3ji7GRONjSr/t5cpKmB6mxzEmsb1n6YZdbP8HCphGcvKR4W+uaX3gVfQE0qvrqlobTyex8yIrkML2bRGO0cQ0YQWRYUwl+2NZufO8pixR1WlzvjooEQLCa77cJ6SZ8LyFkDOI+mMyuj8kcM9hS4AD91rPxl8C0d6Jg8RKqnImxC3X/NNaRYHqlewUGo6VKkcO4+lxgJGqYFcmkGEHzq4fuf6gtrr3rJkcIFcrluI0mSyZ2wXzI9K1OLHK0fnDvDUdV21RdTxfpz2ZFqykIWxdtugE4qaNMgbtV0VnufdkfZoCt9ayU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCVkskQ7Wf194qJaR5aLJzIbxDeKLsVL0wQFKV8r0F7GGZAGvI7/LHajoQ1NvRR35h4P+UpQQWPriVBtLfXYfXQ=", 
"ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHICQbhAvKstSrwCX3R+nlPjOjLF0EHt/gL32n1ZS9Xl", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:b5954bb9-e972-4b2a-94f1-a82c77e96f77", "ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "ip-10-31-45-226.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-226", "ansible_nodename": "ip-10-31-45-226.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ea14692d25e88f0b7167787b368d", "ansible_is_chroot": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_local": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_lsb": {}, "ansible_iscsi_iqn": "", "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:19:da:ea:a3:f3", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.226", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::19:daff:feea:a3f3", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.226", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:19:da:ea:a3:f3", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, 
"ansible_all_ipv4_addresses": ["10.31.45.226"], "ansible_all_ipv6_addresses": ["fe80::19:daff:feea:a3f3"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.226", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::19:daff:feea:a3f3"]}, "ansible_pkg_mgr": "dnf", "ansible_apparmor": {"status": "disabled"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3117, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 599, "free": 3117}, "nocache": {"free": 3500, "used": 216}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ea14-692d-25e8-8f0b-7167787b368d", "ansible_product_uuid": "ec22ea14-692d-25e8-8f0b-7167787b368d", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 760, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251384197120, "block_size": 4096, "block_total": 64483404, "block_available": 61373095, "block_used": 3110309, "inode_total": 16384000, "inode_available": 16303143, "inode_used": 80857, "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"}], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], 
"executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fibre_channel_wwn": [], "ansible_loadavg": {"1m": 0.603515625, "5m": 0.6240234375, "15m": 0.326171875}, "ansible_fips": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.180 36814 10.31.45.226 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 36814 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", "second": "56", "epoch": "1726882616", "epoch_int": "1726882616", "date": "2024-09-20", "time": "21:36:56", "iso8601_micro": "2024-09-21T01:36:56.260597Z", "iso8601": "2024-09-21T01:36:56Z", "iso8601_basic": "20240920T213656260597", "iso8601_basic_short": "20240920T213656", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15621 1726882616.28596: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 
<<< 15621 1726882616.28675: stderr chunk (state=3): >>><<< 15621 1726882616.28680: stdout chunk (state=3): >>><<< 15621 1726882616.28829: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCqxUjCEqNblrTyW6Uf6mIYxca8N+p8oJNuOoXU65bGRNg3CMa5WjaOdqcLJoa5cqHU94Eb2GKTTyez0hcVUk7tsi3NxQudVrBDJQwbGPLKwHTfAOeffQrSKU6cQIc1wl+jLeNyQet7t+mRPHDLLjdsLuWud7KDSFY7tB05hqCIT7br7Ql/dFhcnCdWQFQMOHFOz3ScJe9gey/LD3ji7GRONjSr/t5cpKmB6mxzEmsb1n6YZdbP8HCphGcvKR4W+uaX3gVfQE0qvrqlobTyex8yIrkML2bRGO0cQ0YQWRYUwl+2NZufO8pixR1WlzvjooEQLCa77cJ6SZ8LyFkDOI+mMyuj8kcM9hS4AD91rPxl8C0d6Jg8RKqnImxC3X/NNaRYHqlewUGo6VKkcO4+lxgJGqYFcmkGEHzq4fuf6gtrr3rJkcIFcrluI0mSyZ2wXzI9K1OLHK0fnDvDUdV21RdTxfpz2ZFqykIWxdtugE4qaNMgbtV0VnufdkfZoCt9ayU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCVkskQ7Wf194qJaR5aLJzIbxDeKLsVL0wQFKV8r0F7GGZAGvI7/LHajoQ1NvRR35h4P+UpQQWPriVBtLfXYfXQ=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHICQbhAvKstSrwCX3R+nlPjOjLF0EHt/gL32n1ZS9Xl", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:b5954bb9-e972-4b2a-94f1-a82c77e96f77", "ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "ip-10-31-45-226.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-226", "ansible_nodename": "ip-10-31-45-226.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ea14692d25e88f0b7167787b368d", "ansible_is_chroot": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_local": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_lsb": {}, "ansible_iscsi_iqn": "", "ansible_interfaces": 
["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:19:da:ea:a3:f3", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.226", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::19:daff:feea:a3f3", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.226", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:19:da:ea:a3:f3", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.226"], "ansible_all_ipv6_addresses": ["fe80::19:daff:feea:a3f3"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.226", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::19:daff:feea:a3f3"]}, "ansible_pkg_mgr": "dnf", "ansible_apparmor": {"status": "disabled"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3117, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 599, "free": 3117}, "nocache": {"free": 3500, "used": 216}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ea14-692d-25e8-8f0b-7167787b368d", "ansible_product_uuid": "ec22ea14-692d-25e8-8f0b-7167787b368d", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, 
"sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 760, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251384197120, "block_size": 4096, "block_total": 64483404, "block_available": 61373095, "block_used": 3110309, "inode_total": 16384000, "inode_available": 16303143, "inode_used": 80857, "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"}], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fibre_channel_wwn": [], "ansible_loadavg": {"1m": 0.603515625, "5m": 0.6240234375, "15m": 0.326171875}, "ansible_fips": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.180 36814 10.31.45.226 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 36814 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", "second": "56", "epoch": "1726882616", "epoch_int": "1726882616", "date": "2024-09-20", "time": "21:36:56", "iso8601_micro": "2024-09-21T01:36:56.260597Z", "iso8601": "2024-09-21T01:36:56Z", "iso8601_basic": "20240920T213656260597", "iso8601_basic_short": "20240920T213656", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 15621 1726882616.29089: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882614.062931-17310-20294765529278/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15621 1726882616.29120: _low_level_execute_command(): starting 15621 1726882616.29135: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882614.062931-17310-20294765529278/ > /dev/null 2>&1 && sleep 0' 15621 1726882616.29841: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882616.29884: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882616.29904: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882616.29930: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882616.30056: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882616.32008: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882616.32053: stderr chunk (state=3): >>><<< 15621 1726882616.32056: stdout chunk (state=3): >>><<< 15621 1726882616.32068: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 
originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882616.32078: handler run complete 15621 1726882616.32157: variable 'ansible_facts' from source: unknown 15621 1726882616.32226: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882616.32421: variable 'ansible_facts' from source: unknown 15621 1726882616.32481: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882616.32558: attempt loop complete, returning result 15621 1726882616.32562: _execute() done 15621 1726882616.32566: dumping result to json 15621 1726882616.32586: done dumping result, returning 15621 1726882616.32593: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [0affc7ec-ae25-af1a-5b92-000000000417] 15621 1726882616.32599: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000417 15621 1726882616.32858: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000417 15621 1726882616.32861: WORKER PROCESS EXITING ok: [managed_node3] 15621 1726882616.33169: no more pending results, returning what we have 15621 1726882616.33175: results queue empty 15621 1726882616.33176: checking for any_errors_fatal 15621 1726882616.33177: done checking for any_errors_fatal 15621 1726882616.33178: checking for max_fail_percentage 15621 1726882616.33179: done checking for max_fail_percentage 15621 1726882616.33180: checking to see if all hosts have failed and the running result is not ok 15621 1726882616.33182: done checking to see if all hosts have failed 15621 1726882616.33182: getting the remaining hosts for this loop 15621 1726882616.33183: done getting the remaining hosts for this loop 15621 1726882616.33187: getting the next task for host managed_node3 15621 1726882616.33191: done getting next task for host managed_node3 15621 1726882616.33193: ^ task is: TASK: meta (flush_handlers) 15621 1726882616.33195: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882616.33198: getting variables 15621 1726882616.33199: in VariableManager get_vars() 15621 1726882616.33232: Calling all_inventory to load vars for managed_node3 15621 1726882616.33234: Calling groups_inventory to load vars for managed_node3 15621 1726882616.33236: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882616.33247: Calling all_plugins_play to load vars for managed_node3 15621 1726882616.33249: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882616.33252: Calling groups_plugins_play to load vars for managed_node3 15621 1726882616.34466: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882616.36668: done with get_vars() 15621 1726882616.36701: done getting variables 15621 1726882616.36786: in VariableManager get_vars() 15621 1726882616.36801: Calling all_inventory to load vars for managed_node3 15621 1726882616.36804: Calling groups_inventory to load vars for managed_node3 15621 1726882616.36806: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882616.36812: Calling all_plugins_play to load vars for managed_node3 15621 1726882616.36814: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882616.36817: Calling groups_plugins_play to load vars for managed_node3 15621 1726882616.38491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882616.40731: done with get_vars() 15621 1726882616.40764: done queuing things up, now waiting for results queue to drain 15621 1726882616.40766: results queue empty 15621 1726882616.40767: checking for any_errors_fatal 15621 1726882616.40771: done checking for any_errors_fatal 15621 1726882616.40775: checking for max_fail_percentage 15621 1726882616.40776: done checking for max_fail_percentage 15621 1726882616.40777: checking to see if all hosts have failed and the running result is not ok 15621 1726882616.40778: done checking to see if all hosts have failed 15621 1726882616.40783: getting the remaining hosts for this loop 15621 1726882616.40784: done getting the remaining hosts for this loop 15621 1726882616.40788: getting the next task for host managed_node3 15621 1726882616.40792: done getting next task for host managed_node3 15621 1726882616.40795: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15621 1726882616.40797: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882616.40808: getting variables 15621 1726882616.40809: in VariableManager get_vars() 15621 1726882616.40827: Calling all_inventory to load vars for managed_node3 15621 1726882616.40830: Calling groups_inventory to load vars for managed_node3 15621 1726882616.40832: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882616.40838: Calling all_plugins_play to load vars for managed_node3 15621 1726882616.40841: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882616.40844: Calling groups_plugins_play to load vars for managed_node3 15621 1726882616.43727: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882616.46582: done with get_vars() 15621 1726882616.46607: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:36:56 -0400 (0:00:02.460) 0:00:48.546 ****** 15621 1726882616.46697: entering _queue_task() for managed_node3/include_tasks 15621 1726882616.47075: worker is 1 (out of 1 available) 15621 1726882616.47089: exiting _queue_task() for managed_node3/include_tasks 15621 1726882616.47102: done queuing things up, now waiting for results queue to drain 15621 1726882616.47104: waiting for pending results... 15621 1726882616.47654: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15621 1726882616.47660: in run() - task 0affc7ec-ae25-af1a-5b92-00000000005c 15621 1726882616.47663: variable 'ansible_search_path' from source: unknown 15621 1726882616.47666: variable 'ansible_search_path' from source: unknown 15621 1726882616.47668: calling self._execute() 15621 1726882616.47732: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882616.47751: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882616.47767: variable 'omit' from source: magic vars 15621 1726882616.48195: variable 'ansible_distribution_major_version' from source: facts 15621 1726882616.48212: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882616.48227: _execute() done 15621 1726882616.48236: dumping result to json 15621 1726882616.48245: done dumping result, returning 15621 1726882616.48258: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affc7ec-ae25-af1a-5b92-00000000005c] 15621 1726882616.48292: sending task result for task 0affc7ec-ae25-af1a-5b92-00000000005c 15621 1726882616.48439: no more pending results, returning what we have 15621 1726882616.48445: in VariableManager get_vars() 15621 1726882616.48494: Calling all_inventory to load vars for managed_node3 15621 1726882616.48498: Calling groups_inventory to load vars for managed_node3 15621 1726882616.48500: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882616.48517: Calling all_plugins_play to load vars for managed_node3 15621 1726882616.48520: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882616.48526: Calling groups_plugins_play to load vars for managed_node3 15621 1726882616.49339: done sending task result for task 0affc7ec-ae25-af1a-5b92-00000000005c 15621 1726882616.49343: WORKER PROCESS EXITING 15621 1726882616.50534: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882616.52658: done with get_vars() 15621 1726882616.52684: variable 'ansible_search_path' from source: unknown 15621 1726882616.52685: variable 'ansible_search_path' from source: unknown 15621 1726882616.52714: we have included files to process 15621 1726882616.52716: generating all_blocks data 15621 1726882616.52717: done generating all_blocks data 15621 1726882616.52718: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15621 1726882616.52719: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15621 1726882616.52724: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15621 1726882616.53357: done processing included file 15621 1726882616.53359: iterating over new_blocks loaded from include file 15621 1726882616.53361: in VariableManager get_vars() 15621 1726882616.53386: done with get_vars() 15621 1726882616.53388: filtering new block on tags 15621 1726882616.53405: done filtering new block on tags 15621 1726882616.53407: in VariableManager get_vars() 15621 1726882616.53430: done with get_vars() 15621 1726882616.53432: filtering new block on tags 15621 1726882616.53452: done filtering new block on tags 15621 1726882616.53455: in VariableManager get_vars() 15621 1726882616.53479: done with get_vars() 15621 1726882616.53481: filtering new block on tags 15621 1726882616.53499: done filtering new block on tags 15621 1726882616.53502: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 15621 1726882616.53507: extending task lists for all hosts with included blocks 15621 1726882616.53942: done extending task lists 15621 1726882616.53944: done processing included files 15621 1726882616.53945: results queue empty 15621 1726882616.53945: checking for any_errors_fatal 15621 1726882616.53947: done checking for any_errors_fatal 15621 1726882616.53948: checking for max_fail_percentage 15621 1726882616.53949: done checking for max_fail_percentage 15621 1726882616.53950: checking to see if all hosts have failed and the running result is not ok 15621 1726882616.53951: done checking to see if all hosts have failed 15621 1726882616.53952: getting the remaining hosts for this loop 15621 1726882616.53953: done getting the remaining hosts for this loop 15621 1726882616.53955: getting the next task for host managed_node3 15621 1726882616.53959: done getting next task for host managed_node3 15621 1726882616.53962: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15621 1726882616.53964: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882616.53976: getting variables 15621 1726882616.53977: in VariableManager get_vars() 15621 1726882616.53991: Calling all_inventory to load vars for managed_node3 15621 1726882616.53994: Calling groups_inventory to load vars for managed_node3 15621 1726882616.53996: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882616.54001: Calling all_plugins_play to load vars for managed_node3 15621 1726882616.54004: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882616.54007: Calling groups_plugins_play to load vars for managed_node3 15621 1726882616.55555: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882616.57646: done with get_vars() 15621 1726882616.57670: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:36:56 -0400 (0:00:00.110) 0:00:48.656 ****** 15621 1726882616.57750: entering _queue_task() for managed_node3/setup 15621 1726882616.58110: worker is 1 (out of 1 available) 15621 1726882616.58329: exiting _queue_task() for managed_node3/setup 15621 1726882616.58339: done queuing things up, now waiting for results queue to drain 15621 1726882616.58341: waiting for pending results... 15621 1726882616.58443: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15621 1726882616.58601: in run() - task 0affc7ec-ae25-af1a-5b92-000000000458 15621 1726882616.58625: variable 'ansible_search_path' from source: unknown 15621 1726882616.58635: variable 'ansible_search_path' from source: unknown 15621 1726882616.58687: calling self._execute() 15621 1726882616.58792: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882616.58806: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882616.58820: variable 'omit' from source: magic vars 15621 1726882616.59233: variable 'ansible_distribution_major_version' from source: facts 15621 1726882616.59251: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882616.59503: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15621 1726882616.61951: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15621 1726882616.62030: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15621 1726882616.62082: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15621 1726882616.62126: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15621 1726882616.62164: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15621 1726882616.62261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882616.62304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 15621 1726882616.62341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882616.62395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882616.62417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882616.62489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882616.62519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882616.62554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882616.62609: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882616.62633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882616.62813: variable '__network_required_facts' from source: role '' defaults 15621 1726882616.62918: variable 'ansible_facts' from source: unknown 15621 1726882616.63803: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 15621 1726882616.63812: when evaluation is False, skipping this task 15621 1726882616.63820: _execute() done 15621 1726882616.63831: dumping result to json 15621 1726882616.63841: done dumping result, returning 15621 1726882616.63852: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affc7ec-ae25-af1a-5b92-000000000458] 15621 1726882616.63861: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000458 15621 1726882616.64129: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000458 15621 1726882616.64133: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15621 1726882616.64184: no more pending results, returning what we have 15621 1726882616.64188: results queue empty 15621 1726882616.64190: checking for any_errors_fatal 15621 1726882616.64191: done checking for any_errors_fatal 15621 1726882616.64192: checking for max_fail_percentage 15621 1726882616.64194: done checking for max_fail_percentage 15621 1726882616.64194: checking to see if all hosts have failed and the running result is not ok 15621 1726882616.64196: done checking to see if all hosts have failed 15621 1726882616.64197: getting the remaining hosts for this loop 15621 1726882616.64198: done getting the remaining hosts for 
this loop 15621 1726882616.64204: getting the next task for host managed_node3 15621 1726882616.64213: done getting next task for host managed_node3 15621 1726882616.64217: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 15621 1726882616.64220: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15621 1726882616.64238: getting variables 15621 1726882616.64240: in VariableManager get_vars() 15621 1726882616.64286: Calling all_inventory to load vars for managed_node3 15621 1726882616.64289: Calling groups_inventory to load vars for managed_node3 15621 1726882616.64292: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882616.64304: Calling all_plugins_play to load vars for managed_node3 15621 1726882616.64307: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882616.64311: Calling groups_plugins_play to load vars for managed_node3 15621 1726882616.66152: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882616.68433: done with get_vars() 15621 1726882616.68458: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:36:56 -0400 (0:00:00.108) 0:00:48.764 ****** 15621 1726882616.68565: entering _queue_task() for managed_node3/stat 15621 1726882616.68898: worker is 1 (out of 1 available) 15621 1726882616.68913: exiting _queue_task() for managed_node3/stat 15621 1726882616.69128: done queuing things up, now waiting for results queue to drain 15621 1726882616.69130: waiting for pending results... 
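[editor's note] The skip above comes from the guard at set_facts.yml:3, __network_required_facts | difference(ansible_facts.keys() | list) | length > 0, i.e. setup is only re-run when some required fact key is missing. A minimal Python sketch of that set-difference logic; the fact names below are hypothetical placeholders, since the role's actual __network_required_facts default is not shown in this log:

    # Hypothetical stand-in for the role's __network_required_facts default.
    required_facts = ["distribution", "distribution_major_version", "os_family"]

    # Keys present after gathering (normally ansible_facts.keys() | list).
    gathered_keys = {"distribution", "distribution_major_version",
                     "os_family", "kernel", "service_mgr"}

    # Jinja's difference filter: required entries not present in gathered_keys.
    missing = [name for name in required_facts if name not in gathered_keys]

    # Equivalent of "... | length > 0": re-run setup only if something is missing.
    needs_gather = len(missing) > 0
    print("missing facts:", missing, "-> gather again:", needs_gather)
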
15621 1726882616.69220: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 15621 1726882616.69378: in run() - task 0affc7ec-ae25-af1a-5b92-00000000045a 15621 1726882616.69402: variable 'ansible_search_path' from source: unknown 15621 1726882616.69412: variable 'ansible_search_path' from source: unknown 15621 1726882616.69461: calling self._execute() 15621 1726882616.69563: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882616.69584: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882616.69690: variable 'omit' from source: magic vars 15621 1726882616.70026: variable 'ansible_distribution_major_version' from source: facts 15621 1726882616.70044: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882616.70235: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15621 1726882616.70531: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15621 1726882616.70590: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15621 1726882616.70632: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15621 1726882616.70678: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15621 1726882616.70782: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15621 1726882616.70816: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15621 1726882616.70852: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882616.70892: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15621 1726882616.70993: variable '__network_is_ostree' from source: set_fact 15621 1726882616.71007: Evaluated conditional (not __network_is_ostree is defined): False 15621 1726882616.71015: when evaluation is False, skipping this task 15621 1726882616.71101: _execute() done 15621 1726882616.71105: dumping result to json 15621 1726882616.71108: done dumping result, returning 15621 1726882616.71110: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affc7ec-ae25-af1a-5b92-00000000045a] 15621 1726882616.71113: sending task result for task 0affc7ec-ae25-af1a-5b92-00000000045a 15621 1726882616.71181: done sending task result for task 0affc7ec-ae25-af1a-5b92-00000000045a 15621 1726882616.71185: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 15621 1726882616.71258: no more pending results, returning what we have 15621 1726882616.71261: results queue empty 15621 1726882616.71262: checking for any_errors_fatal 15621 1726882616.71268: done checking for any_errors_fatal 15621 1726882616.71269: checking for 
max_fail_percentage 15621 1726882616.71270: done checking for max_fail_percentage 15621 1726882616.71271: checking to see if all hosts have failed and the running result is not ok 15621 1726882616.71275: done checking to see if all hosts have failed 15621 1726882616.71276: getting the remaining hosts for this loop 15621 1726882616.71277: done getting the remaining hosts for this loop 15621 1726882616.71282: getting the next task for host managed_node3 15621 1726882616.71288: done getting next task for host managed_node3 15621 1726882616.71292: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 15621 1726882616.71295: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15621 1726882616.71310: getting variables 15621 1726882616.71312: in VariableManager get_vars() 15621 1726882616.71354: Calling all_inventory to load vars for managed_node3 15621 1726882616.71357: Calling groups_inventory to load vars for managed_node3 15621 1726882616.71360: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882616.71375: Calling all_plugins_play to load vars for managed_node3 15621 1726882616.71378: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882616.71382: Calling groups_plugins_play to load vars for managed_node3 15621 1726882616.73243: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882616.75367: done with get_vars() 15621 1726882616.75398: done getting variables 15621 1726882616.75465: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:36:56 -0400 (0:00:00.069) 0:00:48.834 ****** 15621 1726882616.75507: entering _queue_task() for managed_node3/set_fact 15621 1726882616.76049: worker is 1 (out of 1 available) 15621 1726882616.76060: exiting _queue_task() for managed_node3/set_fact 15621 1726882616.76070: done queuing things up, now waiting for results queue to drain 15621 1726882616.76075: waiting for pending results... 
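[editor's note] The "Check if system is ostree" stat task above was skipped because __network_is_ostree is already defined from earlier in the run, and the set_fact task queued here carries the same "not __network_is_ostree is defined" guard. A minimal Python sketch of that detect-once-and-cache pattern; /run/ostree-booted is a commonly used marker path, but treat it as an assumption, since the stat target itself is not visible in this log:

    import os

    _cache: dict[str, bool] = {}

    def is_ostree_host() -> bool:
        """Detect an OSTree/rpm-ostree host once and cache the answer,
        mirroring the 'not __network_is_ostree is defined' guard above."""
        if "is_ostree" not in _cache:
            # Assumed marker path created at boot on rpm-ostree systems.
            _cache["is_ostree"] = os.path.exists("/run/ostree-booted")
        return _cache["is_ostree"]

    print("ostree host:", is_ostree_host())
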
15621 1726882616.76315: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 15621 1726882616.76362: in run() - task 0affc7ec-ae25-af1a-5b92-00000000045b 15621 1726882616.76392: variable 'ansible_search_path' from source: unknown 15621 1726882616.76402: variable 'ansible_search_path' from source: unknown 15621 1726882616.76457: calling self._execute() 15621 1726882616.76559: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882616.76576: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882616.76593: variable 'omit' from source: magic vars 15621 1726882616.76995: variable 'ansible_distribution_major_version' from source: facts 15621 1726882616.77013: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882616.77210: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15621 1726882616.77611: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15621 1726882616.77615: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15621 1726882616.77617: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15621 1726882616.77642: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15621 1726882616.77748: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15621 1726882616.77783: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15621 1726882616.77816: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882616.77854: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15621 1726882616.77959: variable '__network_is_ostree' from source: set_fact 15621 1726882616.77971: Evaluated conditional (not __network_is_ostree is defined): False 15621 1726882616.77983: when evaluation is False, skipping this task 15621 1726882616.77990: _execute() done 15621 1726882616.77998: dumping result to json 15621 1726882616.78006: done dumping result, returning 15621 1726882616.78017: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affc7ec-ae25-af1a-5b92-00000000045b] 15621 1726882616.78030: sending task result for task 0affc7ec-ae25-af1a-5b92-00000000045b skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 15621 1726882616.78203: no more pending results, returning what we have 15621 1726882616.78207: results queue empty 15621 1726882616.78209: checking for any_errors_fatal 15621 1726882616.78215: done checking for any_errors_fatal 15621 1726882616.78216: checking for max_fail_percentage 15621 1726882616.78217: done checking for max_fail_percentage 15621 1726882616.78218: checking to see 
if all hosts have failed and the running result is not ok 15621 1726882616.78220: done checking to see if all hosts have failed 15621 1726882616.78220: getting the remaining hosts for this loop 15621 1726882616.78224: done getting the remaining hosts for this loop 15621 1726882616.78228: getting the next task for host managed_node3 15621 1726882616.78238: done getting next task for host managed_node3 15621 1726882616.78242: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 15621 1726882616.78245: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15621 1726882616.78261: getting variables 15621 1726882616.78263: in VariableManager get_vars() 15621 1726882616.78309: Calling all_inventory to load vars for managed_node3 15621 1726882616.78312: Calling groups_inventory to load vars for managed_node3 15621 1726882616.78314: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882616.78627: Calling all_plugins_play to load vars for managed_node3 15621 1726882616.78631: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882616.78638: done sending task result for task 0affc7ec-ae25-af1a-5b92-00000000045b 15621 1726882616.78641: WORKER PROCESS EXITING 15621 1726882616.78645: Calling groups_plugins_play to load vars for managed_node3 15621 1726882616.80447: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882616.82587: done with get_vars() 15621 1726882616.82615: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:36:56 -0400 (0:00:00.072) 0:00:48.906 ****** 15621 1726882616.82725: entering _queue_task() for managed_node3/service_facts 15621 1726882616.83085: worker is 1 (out of 1 available) 15621 1726882616.83099: exiting _queue_task() for managed_node3/service_facts 15621 1726882616.83112: done queuing things up, now waiting for results queue to drain 15621 1726882616.83114: waiting for pending results... 
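[editor's note] The task queued here runs through the service_facts action ("entering _queue_task() for managed_node3/service_facts"). On this host ansible_service_mgr is systemd, so the result is essentially a per-unit state map. A rough Python approximation of that query using systemctl directly; this is an illustration under that assumption, not the module's actual implementation:

    import subprocess

    def systemd_service_states() -> dict[str, dict[str, str]]:
        """Collect load/active/sub state for every systemd service unit."""
        out = subprocess.run(
            ["systemctl", "list-units", "--type=service", "--all",
             "--no-legend", "--plain"],
            capture_output=True, text=True, check=True,
        ).stdout

        services: dict[str, dict[str, str]] = {}
        for line in out.splitlines():
            parts = line.split()
            if len(parts) < 4:
                continue
            # Columns: UNIT LOAD ACTIVE SUB DESCRIPTION...
            unit, load, active, sub = parts[:4]
            services[unit] = {"load": load, "active": active, "sub": sub}
        return services

    if __name__ == "__main__":
        states = systemd_service_states()
        print(f"{len(states)} service units; sshd.service ->", states.get("sshd.service"))
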
15621 1726882616.83417: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 15621 1726882616.83577: in run() - task 0affc7ec-ae25-af1a-5b92-00000000045d 15621 1726882616.83599: variable 'ansible_search_path' from source: unknown 15621 1726882616.83607: variable 'ansible_search_path' from source: unknown 15621 1726882616.83657: calling self._execute() 15621 1726882616.83760: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882616.83778: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882616.83794: variable 'omit' from source: magic vars 15621 1726882616.84204: variable 'ansible_distribution_major_version' from source: facts 15621 1726882616.84224: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882616.84237: variable 'omit' from source: magic vars 15621 1726882616.84300: variable 'omit' from source: magic vars 15621 1726882616.84347: variable 'omit' from source: magic vars 15621 1726882616.84417: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882616.84445: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882616.84469: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882616.84727: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882616.84731: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882616.84734: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882616.84736: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882616.84738: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882616.84741: Set connection var ansible_connection to ssh 15621 1726882616.84743: Set connection var ansible_shell_executable to /bin/sh 15621 1726882616.84745: Set connection var ansible_timeout to 10 15621 1726882616.84748: Set connection var ansible_shell_type to sh 15621 1726882616.84750: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882616.84752: Set connection var ansible_pipelining to False 15621 1726882616.84754: variable 'ansible_shell_executable' from source: unknown 15621 1726882616.84756: variable 'ansible_connection' from source: unknown 15621 1726882616.84759: variable 'ansible_module_compression' from source: unknown 15621 1726882616.84761: variable 'ansible_shell_type' from source: unknown 15621 1726882616.84763: variable 'ansible_shell_executable' from source: unknown 15621 1726882616.84765: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882616.84771: variable 'ansible_pipelining' from source: unknown 15621 1726882616.84782: variable 'ansible_timeout' from source: unknown 15621 1726882616.84790: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882616.85017: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15621 1726882616.85037: variable 'omit' from source: magic vars 15621 
1726882616.85047: starting attempt loop 15621 1726882616.85054: running the handler 15621 1726882616.85075: _low_level_execute_command(): starting 15621 1726882616.85088: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15621 1726882616.85863: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882616.85884: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882616.85941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882616.86014: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882616.86039: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882616.86064: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882616.86196: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882616.87945: stdout chunk (state=3): >>>/root <<< 15621 1726882616.88131: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882616.88164: stdout chunk (state=3): >>><<< 15621 1726882616.88167: stderr chunk (state=3): >>><<< 15621 1726882616.88187: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882616.88287: _low_level_execute_command(): starting 15621 1726882616.88292: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726882616.8819292-17433-260800961197982 `" && echo ansible-tmp-1726882616.8819292-17433-260800961197982="` echo /root/.ansible/tmp/ansible-tmp-1726882616.8819292-17433-260800961197982 `" ) && sleep 0' 15621 1726882616.88943: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882616.88989: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882616.89007: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882616.89029: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882616.89152: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882616.91117: stdout chunk (state=3): >>>ansible-tmp-1726882616.8819292-17433-260800961197982=/root/.ansible/tmp/ansible-tmp-1726882616.8819292-17433-260800961197982 <<< 15621 1726882616.91305: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882616.91317: stdout chunk (state=3): >>><<< 15621 1726882616.91526: stderr chunk (state=3): >>><<< 15621 1726882616.91530: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882616.8819292-17433-260800961197982=/root/.ansible/tmp/ansible-tmp-1726882616.8819292-17433-260800961197982 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882616.91533: variable 'ansible_module_compression' from source: unknown 15621 1726882616.91535: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-15621x2kdpbzd/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 15621 1726882616.91538: variable 'ansible_facts' from source: unknown 15621 1726882616.91589: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882616.8819292-17433-260800961197982/AnsiballZ_service_facts.py 15621 1726882616.91743: Sending initial data 15621 1726882616.91753: Sent initial data (162 bytes) 15621 1726882616.92433: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882616.92450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882616.92468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882616.92542: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882616.92569: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882616.92591: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882616.92612: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882616.92733: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882616.94339: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15621 1726882616.94438: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15621 1726882616.94547: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmpjf7g3iya /root/.ansible/tmp/ansible-tmp-1726882616.8819292-17433-260800961197982/AnsiballZ_service_facts.py <<< 15621 1726882616.94557: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882616.8819292-17433-260800961197982/AnsiballZ_service_facts.py" <<< 15621 1726882616.94633: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmpjf7g3iya" to remote "/root/.ansible/tmp/ansible-tmp-1726882616.8819292-17433-260800961197982/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882616.8819292-17433-260800961197982/AnsiballZ_service_facts.py" <<< 15621 1726882616.95621: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882616.95658: stderr chunk (state=3): >>><<< 15621 1726882616.95671: stdout chunk (state=3): >>><<< 15621 1726882616.95788: done transferring module to remote 15621 1726882616.95792: _low_level_execute_command(): starting 15621 1726882616.95795: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882616.8819292-17433-260800961197982/ /root/.ansible/tmp/ansible-tmp-1726882616.8819292-17433-260800961197982/AnsiballZ_service_facts.py && sleep 0' 15621 1726882616.96381: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882616.96402: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882616.96518: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882616.98422: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882616.98447: stdout chunk (state=3): >>><<< 15621 1726882616.98460: stderr chunk (state=3): >>><<< 15621 1726882616.98483: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882616.98528: _low_level_execute_command(): starting 15621 1726882616.98532: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882616.8819292-17433-260800961197982/AnsiballZ_service_facts.py && sleep 0' 15621 1726882616.99140: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882616.99155: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882616.99181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882616.99201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882616.99220: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 15621 1726882616.99238: stderr chunk (state=3): >>>debug2: match not found <<< 15621 1726882616.99253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882616.99283: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15621 1726882616.99339: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882616.99391: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882616.99416: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882616.99434: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882616.99548: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882619.14817: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"<<< 15621 1726882619.14838: stdout chunk (state=3): >>>name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": 
"mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind<<< 15621 1726882619.14862: stdout chunk (state=3): >>>.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": 
"enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-upd<<< 15621 1726882619.14893: stdout chunk (state=3): >>>ate-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": 
"unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inacti<<< 15621 1726882619.14901: stdout chunk (state=3): >>>ve", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, 
"systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", 
"status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 15621 1726882619.16534: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. <<< 15621 1726882619.16591: stderr chunk (state=3): >>><<< 15621 1726882619.16595: stdout chunk (state=3): >>><<< 15621 1726882619.16630: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": 
"not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": 
"systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", 
"source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": 
"udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": 
"dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", 
"state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 
15621 1726882619.17163: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882616.8819292-17433-260800961197982/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15621 1726882619.17171: _low_level_execute_command(): starting 15621 1726882619.17177: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882616.8819292-17433-260800961197982/ > /dev/null 2>&1 && sleep 0' 15621 1726882619.17652: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882619.17655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found <<< 15621 1726882619.17658: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15621 1726882619.17661: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882619.17663: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882619.17718: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882619.17726: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882619.17807: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882619.19763: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882619.19812: stderr chunk (state=3): >>><<< 15621 1726882619.19816: stdout chunk (state=3): >>><<< 15621 1726882619.19831: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882619.19838: handler run complete 15621 1726882619.19982: variable 'ansible_facts' from source: unknown 15621 1726882619.20112: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882619.20475: variable 'ansible_facts' from source: unknown 15621 1726882619.20575: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882619.20735: attempt loop complete, returning result 15621 1726882619.20738: _execute() done 15621 1726882619.20741: dumping result to json 15621 1726882619.20788: done dumping result, returning 15621 1726882619.20796: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0affc7ec-ae25-af1a-5b92-00000000045d] 15621 1726882619.20805: sending task result for task 0affc7ec-ae25-af1a-5b92-00000000045d ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15621 1726882619.21528: done sending task result for task 0affc7ec-ae25-af1a-5b92-00000000045d 15621 1726882619.21532: WORKER PROCESS EXITING 15621 1726882619.21541: no more pending results, returning what we have 15621 1726882619.21549: results queue empty 15621 1726882619.21549: checking for any_errors_fatal 15621 1726882619.21552: done checking for any_errors_fatal 15621 1726882619.21553: checking for max_fail_percentage 15621 1726882619.21554: done checking for max_fail_percentage 15621 1726882619.21554: checking to see if all hosts have failed and the running result is not ok 15621 1726882619.21555: done checking to see if all hosts have failed 15621 1726882619.21556: getting the remaining hosts for this loop 15621 1726882619.21557: done getting the remaining hosts for this loop 15621 1726882619.21559: getting the next task for host managed_node3 15621 1726882619.21564: done getting next task for host managed_node3 15621 1726882619.21566: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 15621 1726882619.21568: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882619.21576: getting variables 15621 1726882619.21577: in VariableManager get_vars() 15621 1726882619.21603: Calling all_inventory to load vars for managed_node3 15621 1726882619.21604: Calling groups_inventory to load vars for managed_node3 15621 1726882619.21606: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882619.21613: Calling all_plugins_play to load vars for managed_node3 15621 1726882619.21615: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882619.21617: Calling groups_plugins_play to load vars for managed_node3 15621 1726882619.22614: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882619.23783: done with get_vars() 15621 1726882619.23802: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:36:59 -0400 (0:00:02.411) 0:00:51.318 ****** 15621 1726882619.23885: entering _queue_task() for managed_node3/package_facts 15621 1726882619.24156: worker is 1 (out of 1 available) 15621 1726882619.24172: exiting _queue_task() for managed_node3/package_facts 15621 1726882619.24186: done queuing things up, now waiting for results queue to drain 15621 1726882619.24188: waiting for pending results... 15621 1726882619.24378: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 15621 1726882619.24468: in run() - task 0affc7ec-ae25-af1a-5b92-00000000045e 15621 1726882619.24485: variable 'ansible_search_path' from source: unknown 15621 1726882619.24489: variable 'ansible_search_path' from source: unknown 15621 1726882619.24526: calling self._execute() 15621 1726882619.24599: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882619.24603: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882619.24613: variable 'omit' from source: magic vars 15621 1726882619.24911: variable 'ansible_distribution_major_version' from source: facts 15621 1726882619.24923: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882619.24929: variable 'omit' from source: magic vars 15621 1726882619.24976: variable 'omit' from source: magic vars 15621 1726882619.25003: variable 'omit' from source: magic vars 15621 1726882619.25038: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882619.25068: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882619.25090: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882619.25106: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882619.25116: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882619.25143: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882619.25147: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882619.25149: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882619.25229: Set connection var ansible_connection to ssh 15621 
1726882619.25238: Set connection var ansible_shell_executable to /bin/sh 15621 1726882619.25243: Set connection var ansible_timeout to 10 15621 1726882619.25246: Set connection var ansible_shell_type to sh 15621 1726882619.25251: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882619.25256: Set connection var ansible_pipelining to False 15621 1726882619.25281: variable 'ansible_shell_executable' from source: unknown 15621 1726882619.25284: variable 'ansible_connection' from source: unknown 15621 1726882619.25289: variable 'ansible_module_compression' from source: unknown 15621 1726882619.25292: variable 'ansible_shell_type' from source: unknown 15621 1726882619.25294: variable 'ansible_shell_executable' from source: unknown 15621 1726882619.25296: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882619.25299: variable 'ansible_pipelining' from source: unknown 15621 1726882619.25301: variable 'ansible_timeout' from source: unknown 15621 1726882619.25303: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882619.25467: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15621 1726882619.25475: variable 'omit' from source: magic vars 15621 1726882619.25482: starting attempt loop 15621 1726882619.25485: running the handler 15621 1726882619.25498: _low_level_execute_command(): starting 15621 1726882619.25509: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15621 1726882619.26065: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882619.26068: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882619.26072: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882619.26077: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882619.26134: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882619.26138: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882619.26149: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882619.26231: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882619.27991: stdout chunk (state=3): >>>/root <<< 15621 1726882619.28102: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882619.28158: stderr chunk (state=3): >>><<< 15621 1726882619.28162: stdout chunk (state=3): >>><<< 15621 
1726882619.28187: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882619.28197: _low_level_execute_command(): starting 15621 1726882619.28203: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882619.2818394-17497-246454659267839 `" && echo ansible-tmp-1726882619.2818394-17497-246454659267839="` echo /root/.ansible/tmp/ansible-tmp-1726882619.2818394-17497-246454659267839 `" ) && sleep 0' 15621 1726882619.28683: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882619.28686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882619.28696: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882619.28698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found <<< 15621 1726882619.28701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882619.28748: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882619.28755: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882619.28844: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882619.30839: stdout chunk (state=3): >>>ansible-tmp-1726882619.2818394-17497-246454659267839=/root/.ansible/tmp/ansible-tmp-1726882619.2818394-17497-246454659267839 <<< 15621 1726882619.30957: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 
1726882619.31010: stderr chunk (state=3): >>><<< 15621 1726882619.31013: stdout chunk (state=3): >>><<< 15621 1726882619.31030: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882619.2818394-17497-246454659267839=/root/.ansible/tmp/ansible-tmp-1726882619.2818394-17497-246454659267839 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882619.31070: variable 'ansible_module_compression' from source: unknown 15621 1726882619.31110: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15621x2kdpbzd/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 15621 1726882619.31164: variable 'ansible_facts' from source: unknown 15621 1726882619.31283: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882619.2818394-17497-246454659267839/AnsiballZ_package_facts.py 15621 1726882619.31401: Sending initial data 15621 1726882619.31405: Sent initial data (162 bytes) 15621 1726882619.31890: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882619.31894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882619.31896: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration <<< 15621 1726882619.31899: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882619.31901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882619.31946: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882619.31951: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882619.32044: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882619.33690: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15621 1726882619.33775: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15621 1726882619.33860: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmpa7yt494e /root/.ansible/tmp/ansible-tmp-1726882619.2818394-17497-246454659267839/AnsiballZ_package_facts.py <<< 15621 1726882619.33862: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882619.2818394-17497-246454659267839/AnsiballZ_package_facts.py" <<< 15621 1726882619.33942: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmpa7yt494e" to remote "/root/.ansible/tmp/ansible-tmp-1726882619.2818394-17497-246454659267839/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882619.2818394-17497-246454659267839/AnsiballZ_package_facts.py" <<< 15621 1726882619.35464: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882619.35536: stderr chunk (state=3): >>><<< 15621 1726882619.35540: stdout chunk (state=3): >>><<< 15621 1726882619.35561: done transferring module to remote 15621 1726882619.35573: _low_level_execute_command(): starting 15621 1726882619.35580: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882619.2818394-17497-246454659267839/ /root/.ansible/tmp/ansible-tmp-1726882619.2818394-17497-246454659267839/AnsiballZ_package_facts.py && sleep 0' 15621 1726882619.36059: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882619.36062: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found <<< 15621 1726882619.36065: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15621 1726882619.36067: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882619.36076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882619.36127: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882619.36132: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882619.36217: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882619.38231: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882619.38235: stdout chunk (state=3): >>><<< 15621 1726882619.38237: stderr chunk (state=3): >>><<< 15621 1726882619.38240: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882619.38242: _low_level_execute_command(): starting 15621 1726882619.38245: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882619.2818394-17497-246454659267839/AnsiballZ_package_facts.py && sleep 0' 15621 1726882619.38857: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882619.38872: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882619.38885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882619.38929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882619.39033: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882619.39057: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 15621 1726882619.39176: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882620.01744: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, <<< 15621 1726882620.01875: stdout chunk (state=3): >>>"arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", 
"release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": 
"hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": 
"2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": 
"libsss_sudo", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", 
"version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3<<< 15621 1726882620.01892: stdout chunk (state=3): >>>-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", 
"release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": 
"1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "<<< 15621 1726882620.01981: stdout chunk (state=3): >>>x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", 
"release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": 
"0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", 
"epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": 
"502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "sour<<< 15621 1726882620.02020: stdout chunk (state=3): >>>ce": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 15621 1726882620.03953: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 
<<< 15621 1726882620.04146: stderr chunk (state=3): >>><<< 15621 1726882620.04327: stdout chunk (state=3): >>><<< 15621 1726882620.04603: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": 
"libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", 
"release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 
1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", 
"release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": 
[{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": 
"3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", 
"version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", 
"version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": 
"1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "1.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": 
"wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": 
"python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 15621 1726882620.09312: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882619.2818394-17497-246454659267839/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15621 1726882620.09360: _low_level_execute_command(): starting 15621 1726882620.09364: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882619.2818394-17497-246454659267839/ > /dev/null 2>&1 && sleep 0' 15621 1726882620.10104: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882620.10111: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882620.10127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882620.10143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882620.10156: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 15621 1726882620.10193: stderr chunk (state=3): >>>debug2: match not found <<< 15621 1726882620.10196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882620.10199: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15621 1726882620.10202: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.226 is address <<< 15621 1726882620.10204: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15621 1726882620.10232: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882620.10290: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882620.10323: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882620.10331: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882620.10444: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882620.12601: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882620.12604: stdout chunk (state=3): >>><<< 15621 1726882620.12693: stderr chunk (state=3): >>><<< 15621 1726882620.12695: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882620.12698: handler run complete 15621 1726882620.14730: variable 'ansible_facts' from source: unknown 15621 1726882620.15620: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882620.19036: variable 'ansible_facts' from source: unknown 15621 1726882620.19631: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882620.21399: attempt loop complete, returning result 15621 1726882620.21410: _execute() done 15621 1726882620.21415: dumping result to json 15621 1726882620.22095: done dumping result, returning 15621 1726882620.22138: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affc7ec-ae25-af1a-5b92-00000000045e] 15621 1726882620.22156: sending task result for task 0affc7ec-ae25-af1a-5b92-00000000045e 15621 1726882620.26896: done sending task result for task 0affc7ec-ae25-af1a-5b92-00000000045e 15621 1726882620.26900: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15621 1726882620.27064: no more pending results, returning what we have 15621 1726882620.27068: results queue empty 15621 1726882620.27069: checking for any_errors_fatal 15621 1726882620.27076: done checking for any_errors_fatal 15621 1726882620.27076: checking for max_fail_percentage 15621 1726882620.27078: done checking for max_fail_percentage 15621 1726882620.27079: checking to see if all hosts have failed and the running result is not ok 15621 1726882620.27080: done checking to see if all hosts have failed 15621 1726882620.27080: getting the remaining hosts for this loop 15621 1726882620.27081: done getting the remaining hosts for this loop 15621 1726882620.27092: getting the next task for host managed_node3 15621 1726882620.27099: done getting next task for host managed_node3 15621 1726882620.27103: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 15621 1726882620.27105: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882620.27116: getting variables 15621 1726882620.27117: in VariableManager get_vars() 15621 1726882620.27153: Calling all_inventory to load vars for managed_node3 15621 1726882620.27156: Calling groups_inventory to load vars for managed_node3 15621 1726882620.27159: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882620.27177: Calling all_plugins_play to load vars for managed_node3 15621 1726882620.27181: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882620.27186: Calling groups_plugins_play to load vars for managed_node3 15621 1726882620.29187: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882620.32521: done with get_vars() 15621 1726882620.32560: done getting variables 15621 1726882620.32749: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:37:00 -0400 (0:00:01.089) 0:00:52.408 ****** 15621 1726882620.32906: entering _queue_task() for managed_node3/debug 15621 1726882620.33714: worker is 1 (out of 1 available) 15621 1726882620.33732: exiting _queue_task() for managed_node3/debug 15621 1726882620.33749: done queuing things up, now waiting for results queue to drain 15621 1726882620.33751: waiting for pending results... 15621 1726882620.34546: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 15621 1726882620.34554: in run() - task 0affc7ec-ae25-af1a-5b92-00000000005d 15621 1726882620.34560: variable 'ansible_search_path' from source: unknown 15621 1726882620.34563: variable 'ansible_search_path' from source: unknown 15621 1726882620.34566: calling self._execute() 15621 1726882620.34571: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882620.34575: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882620.34697: variable 'omit' from source: magic vars 15621 1726882620.35076: variable 'ansible_distribution_major_version' from source: facts 15621 1726882620.35088: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882620.35094: variable 'omit' from source: magic vars 15621 1726882620.35179: variable 'omit' from source: magic vars 15621 1726882620.35436: variable 'network_provider' from source: set_fact 15621 1726882620.35470: variable 'omit' from source: magic vars 15621 1726882620.35567: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882620.35578: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882620.35592: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882620.35612: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882620.35625: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 
1726882620.35657: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882620.35661: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882620.35664: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882620.35796: Set connection var ansible_connection to ssh 15621 1726882620.35805: Set connection var ansible_shell_executable to /bin/sh 15621 1726882620.35808: Set connection var ansible_timeout to 10 15621 1726882620.35811: Set connection var ansible_shell_type to sh 15621 1726882620.35820: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882620.36013: Set connection var ansible_pipelining to False 15621 1726882620.36032: variable 'ansible_shell_executable' from source: unknown 15621 1726882620.36036: variable 'ansible_connection' from source: unknown 15621 1726882620.36042: variable 'ansible_module_compression' from source: unknown 15621 1726882620.36047: variable 'ansible_shell_type' from source: unknown 15621 1726882620.36050: variable 'ansible_shell_executable' from source: unknown 15621 1726882620.36052: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882620.36057: variable 'ansible_pipelining' from source: unknown 15621 1726882620.36060: variable 'ansible_timeout' from source: unknown 15621 1726882620.36065: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882620.36284: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15621 1726882620.36297: variable 'omit' from source: magic vars 15621 1726882620.36303: starting attempt loop 15621 1726882620.36306: running the handler 15621 1726882620.36357: handler run complete 15621 1726882620.36442: attempt loop complete, returning result 15621 1726882620.36446: _execute() done 15621 1726882620.36450: dumping result to json 15621 1726882620.36452: done dumping result, returning 15621 1726882620.36454: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0affc7ec-ae25-af1a-5b92-00000000005d] 15621 1726882620.36457: sending task result for task 0affc7ec-ae25-af1a-5b92-00000000005d ok: [managed_node3] => {} MSG: Using network provider: nm 15621 1726882620.36701: no more pending results, returning what we have 15621 1726882620.36704: results queue empty 15621 1726882620.36707: checking for any_errors_fatal 15621 1726882620.36713: done checking for any_errors_fatal 15621 1726882620.36714: checking for max_fail_percentage 15621 1726882620.36716: done checking for max_fail_percentage 15621 1726882620.36716: checking to see if all hosts have failed and the running result is not ok 15621 1726882620.36718: done checking to see if all hosts have failed 15621 1726882620.36718: getting the remaining hosts for this loop 15621 1726882620.36720: done getting the remaining hosts for this loop 15621 1726882620.36725: getting the next task for host managed_node3 15621 1726882620.36730: done getting next task for host managed_node3 15621 1726882620.36734: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 15621 1726882620.36736: ^ state is: HOST STATE: block=2, 
task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15621 1726882620.36745: getting variables 15621 1726882620.36746: in VariableManager get_vars() 15621 1726882620.36781: Calling all_inventory to load vars for managed_node3 15621 1726882620.36784: Calling groups_inventory to load vars for managed_node3 15621 1726882620.36786: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882620.36795: Calling all_plugins_play to load vars for managed_node3 15621 1726882620.36798: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882620.36800: Calling groups_plugins_play to load vars for managed_node3 15621 1726882620.37348: done sending task result for task 0affc7ec-ae25-af1a-5b92-00000000005d 15621 1726882620.37352: WORKER PROCESS EXITING 15621 1726882620.38744: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882620.41303: done with get_vars() 15621 1726882620.41335: done getting variables 15621 1726882620.41405: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:37:00 -0400 (0:00:00.085) 0:00:52.493 ****** 15621 1726882620.41443: entering _queue_task() for managed_node3/fail 15621 1726882620.41801: worker is 1 (out of 1 available) 15621 1726882620.41821: exiting _queue_task() for managed_node3/fail 15621 1726882620.41840: done queuing things up, now waiting for results queue to drain 15621 1726882620.41842: waiting for pending results... 
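The "Print network provider" result recorded above (ok: [managed_node3] => {} with MSG: Using network provider: nm) is what a debug task gated on the distribution version produces in this trace. A minimal sketch of such a task, assuming a network_provider fact set earlier in the role via set_fact; this is an illustration of the pattern, not the role's actual task file:

    - name: Print network provider
      ansible.builtin.debug:
        msg: "Using network provider: {{ network_provider }}"
      when: ansible_distribution_major_version != '6'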
15621 1726882620.42163: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 15621 1726882620.42317: in run() - task 0affc7ec-ae25-af1a-5b92-00000000005e 15621 1726882620.42370: variable 'ansible_search_path' from source: unknown 15621 1726882620.42401: variable 'ansible_search_path' from source: unknown 15621 1726882620.42477: calling self._execute() 15621 1726882620.42678: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882620.42683: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882620.42696: variable 'omit' from source: magic vars 15621 1726882620.43265: variable 'ansible_distribution_major_version' from source: facts 15621 1726882620.43342: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882620.43439: variable 'network_state' from source: role '' defaults 15621 1726882620.43464: Evaluated conditional (network_state != {}): False 15621 1726882620.43475: when evaluation is False, skipping this task 15621 1726882620.43484: _execute() done 15621 1726882620.43492: dumping result to json 15621 1726882620.43501: done dumping result, returning 15621 1726882620.43530: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affc7ec-ae25-af1a-5b92-00000000005e] 15621 1726882620.43654: sending task result for task 0affc7ec-ae25-af1a-5b92-00000000005e 15621 1726882620.43751: done sending task result for task 0affc7ec-ae25-af1a-5b92-00000000005e 15621 1726882620.43756: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15621 1726882620.43816: no more pending results, returning what we have 15621 1726882620.43820: results queue empty 15621 1726882620.43823: checking for any_errors_fatal 15621 1726882620.43831: done checking for any_errors_fatal 15621 1726882620.43832: checking for max_fail_percentage 15621 1726882620.43834: done checking for max_fail_percentage 15621 1726882620.43834: checking to see if all hosts have failed and the running result is not ok 15621 1726882620.43836: done checking to see if all hosts have failed 15621 1726882620.43836: getting the remaining hosts for this loop 15621 1726882620.43838: done getting the remaining hosts for this loop 15621 1726882620.43842: getting the next task for host managed_node3 15621 1726882620.43848: done getting next task for host managed_node3 15621 1726882620.43852: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 15621 1726882620.43854: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882620.43871: getting variables 15621 1726882620.43873: in VariableManager get_vars() 15621 1726882620.43915: Calling all_inventory to load vars for managed_node3 15621 1726882620.43917: Calling groups_inventory to load vars for managed_node3 15621 1726882620.43919: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882620.44165: Calling all_plugins_play to load vars for managed_node3 15621 1726882620.44168: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882620.44171: Calling groups_plugins_play to load vars for managed_node3 15621 1726882620.46107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882620.48701: done with get_vars() 15621 1726882620.48731: done getting variables 15621 1726882620.48810: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:37:00 -0400 (0:00:00.074) 0:00:52.567 ****** 15621 1726882620.48849: entering _queue_task() for managed_node3/fail 15621 1726882620.49281: worker is 1 (out of 1 available) 15621 1726882620.49299: exiting _queue_task() for managed_node3/fail 15621 1726882620.49312: done queuing things up, now waiting for results queue to drain 15621 1726882620.49314: waiting for pending results... 
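The skip recorded above ("false_condition": "network_state != {}") is the normal trace for a guard task whose when: clause evaluates to False before the module ever runs. A hedged sketch of that guard pattern, assuming the role aborts only when network_state is non-empty and the initscripts provider is in use (variable names follow the trace; the literal task body and message are not shown in this log):

    - name: Abort applying the network state configuration if using the network_state variable with the initscripts provider
      ansible.builtin.fail:
        msg: Applying the network state configuration is not supported with the initscripts provider
      when:
        - network_state != {}
        - network_provider == "initscripts"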
15621 1726882620.49949: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 15621 1726882620.50178: in run() - task 0affc7ec-ae25-af1a-5b92-00000000005f 15621 1726882620.50183: variable 'ansible_search_path' from source: unknown 15621 1726882620.50188: variable 'ansible_search_path' from source: unknown 15621 1726882620.50266: calling self._execute() 15621 1726882620.50410: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882620.50432: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882620.50450: variable 'omit' from source: magic vars 15621 1726882620.50982: variable 'ansible_distribution_major_version' from source: facts 15621 1726882620.51032: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882620.51186: variable 'network_state' from source: role '' defaults 15621 1726882620.51243: Evaluated conditional (network_state != {}): False 15621 1726882620.51247: when evaluation is False, skipping this task 15621 1726882620.51250: _execute() done 15621 1726882620.51253: dumping result to json 15621 1726882620.51262: done dumping result, returning 15621 1726882620.51267: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affc7ec-ae25-af1a-5b92-00000000005f] 15621 1726882620.51279: sending task result for task 0affc7ec-ae25-af1a-5b92-00000000005f skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15621 1726882620.51647: no more pending results, returning what we have 15621 1726882620.51655: results queue empty 15621 1726882620.51656: checking for any_errors_fatal 15621 1726882620.51669: done checking for any_errors_fatal 15621 1726882620.51670: checking for max_fail_percentage 15621 1726882620.51671: done checking for max_fail_percentage 15621 1726882620.51674: checking to see if all hosts have failed and the running result is not ok 15621 1726882620.51676: done checking to see if all hosts have failed 15621 1726882620.51677: getting the remaining hosts for this loop 15621 1726882620.51679: done getting the remaining hosts for this loop 15621 1726882620.51692: getting the next task for host managed_node3 15621 1726882620.51699: done getting next task for host managed_node3 15621 1726882620.51704: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 15621 1726882620.51707: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882620.51732: getting variables 15621 1726882620.51734: in VariableManager get_vars() 15621 1726882620.51788: Calling all_inventory to load vars for managed_node3 15621 1726882620.51911: Calling groups_inventory to load vars for managed_node3 15621 1726882620.51915: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882620.52033: Calling all_plugins_play to load vars for managed_node3 15621 1726882620.52037: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882620.52041: Calling groups_plugins_play to load vars for managed_node3 15621 1726882620.60843: done sending task result for task 0affc7ec-ae25-af1a-5b92-00000000005f 15621 1726882620.60848: WORKER PROCESS EXITING 15621 1726882620.62148: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882620.64458: done with get_vars() 15621 1726882620.64492: done getting variables 15621 1726882620.64562: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:37:00 -0400 (0:00:00.157) 0:00:52.725 ****** 15621 1726882620.64590: entering _queue_task() for managed_node3/fail 15621 1726882620.65019: worker is 1 (out of 1 available) 15621 1726882620.65036: exiting _queue_task() for managed_node3/fail 15621 1726882620.65057: done queuing things up, now waiting for results queue to drain 15621 1726882620.65059: waiting for pending results... 
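The EL10 teaming guard traced above evaluates two conditionals in order: the version check passes on this host (ansible_distribution_major_version | int > 9 is True for Fedora 40), but ansible_distribution in __network_rh_distros is False, so the task is skipped. A sketch of that ordering under the same two conditions; the failure message is an assumption, not taken from the role:

    - name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
      ansible.builtin.fail:
        msg: Teaming is not supported on EL10 or later
      when:
        - ansible_distribution_major_version | int > 9
        - ansible_distribution in __network_rh_distros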
15621 1726882620.65565: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 15621 1726882620.65717: in run() - task 0affc7ec-ae25-af1a-5b92-000000000060 15621 1726882620.65738: variable 'ansible_search_path' from source: unknown 15621 1726882620.65771: variable 'ansible_search_path' from source: unknown 15621 1726882620.65819: calling self._execute() 15621 1726882620.65956: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882620.65963: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882620.65975: variable 'omit' from source: magic vars 15621 1726882620.66497: variable 'ansible_distribution_major_version' from source: facts 15621 1726882620.66510: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882620.66715: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15621 1726882620.70114: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15621 1726882620.70408: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15621 1726882620.70481: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15621 1726882620.70505: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15621 1726882620.70589: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15621 1726882620.70680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882620.70714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882620.70769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882620.70878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882620.70890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882620.70990: variable 'ansible_distribution_major_version' from source: facts 15621 1726882620.71006: Evaluated conditional (ansible_distribution_major_version | int > 9): True 15621 1726882620.71135: variable 'ansible_distribution' from source: facts 15621 1726882620.71227: variable '__network_rh_distros' from source: role '' defaults 15621 1726882620.71232: Evaluated conditional (ansible_distribution in __network_rh_distros): False 15621 1726882620.71236: when evaluation is False, skipping this task 15621 1726882620.71238: _execute() done 15621 1726882620.71241: dumping result to json 15621 1726882620.71244: done dumping result, returning 15621 1726882620.71248: done running TaskExecutor() 
for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affc7ec-ae25-af1a-5b92-000000000060] 15621 1726882620.71251: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000060 15621 1726882620.71326: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000060 15621 1726882620.71329: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 15621 1726882620.71387: no more pending results, returning what we have 15621 1726882620.71391: results queue empty 15621 1726882620.71392: checking for any_errors_fatal 15621 1726882620.71401: done checking for any_errors_fatal 15621 1726882620.71402: checking for max_fail_percentage 15621 1726882620.71403: done checking for max_fail_percentage 15621 1726882620.71404: checking to see if all hosts have failed and the running result is not ok 15621 1726882620.71405: done checking to see if all hosts have failed 15621 1726882620.71406: getting the remaining hosts for this loop 15621 1726882620.71407: done getting the remaining hosts for this loop 15621 1726882620.71411: getting the next task for host managed_node3 15621 1726882620.71417: done getting next task for host managed_node3 15621 1726882620.71421: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 15621 1726882620.71425: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882620.71625: getting variables 15621 1726882620.71628: in VariableManager get_vars() 15621 1726882620.71669: Calling all_inventory to load vars for managed_node3 15621 1726882620.71672: Calling groups_inventory to load vars for managed_node3 15621 1726882620.71677: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882620.71689: Calling all_plugins_play to load vars for managed_node3 15621 1726882620.71693: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882620.71696: Calling groups_plugins_play to load vars for managed_node3 15621 1726882620.73846: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882620.76166: done with get_vars() 15621 1726882620.76210: done getting variables 15621 1726882620.76284: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:37:00 -0400 (0:00:00.117) 0:00:52.842 ****** 15621 1726882620.76330: entering _queue_task() for managed_node3/dnf 15621 1726882620.76802: worker is 1 (out of 1 available) 15621 1726882620.76816: exiting _queue_task() for managed_node3/dnf 15621 1726882620.76833: done queuing things up, now waiting for results queue to drain 15621 1726882620.76835: waiting for pending results... 
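The task being queued here checks DNF for newer network packages when wireless or team interfaces are defined. A rough sketch of one way to express such a probe with the stock dnf module in check mode; network_packages and the registered variable name are placeholders introduced for illustration, not names taken from this log:

    - name: Check if updates for network packages are available through the DNF package manager
      ansible.builtin.dnf:
        name: "{{ network_packages }}"    # placeholder list of package names
        state: latest
      check_mode: true
      register: network_package_updates   # placeholder register name
      when: __network_wireless_connections_defined | bool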
15621 1726882620.77149: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 15621 1726882620.77260: in run() - task 0affc7ec-ae25-af1a-5b92-000000000061 15621 1726882620.77286: variable 'ansible_search_path' from source: unknown 15621 1726882620.77359: variable 'ansible_search_path' from source: unknown 15621 1726882620.77365: calling self._execute() 15621 1726882620.77493: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882620.77509: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882620.77529: variable 'omit' from source: magic vars 15621 1726882620.78037: variable 'ansible_distribution_major_version' from source: facts 15621 1726882620.78058: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882620.78351: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15621 1726882620.81600: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15621 1726882620.81690: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15621 1726882620.81732: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15621 1726882620.81775: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15621 1726882620.81800: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15621 1726882620.81891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882620.81923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882620.81954: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882620.82066: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882620.82075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882620.82153: variable 'ansible_distribution' from source: facts 15621 1726882620.82157: variable 'ansible_distribution_major_version' from source: facts 15621 1726882620.82165: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 15621 1726882620.82301: variable '__network_wireless_connections_defined' from source: role '' defaults 15621 1726882620.82455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882620.82480: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882620.82506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882620.82555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882620.82608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882620.82612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882620.82644: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882620.82671: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882620.82717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882620.82729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882620.82780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882620.82826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882620.82836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882620.82879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882620.82929: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882620.83081: variable 'network_connections' from source: play vars 15621 1726882620.83093: variable 'profile' from source: play vars 15621 1726882620.83170: variable 'profile' from source: play vars 15621 1726882620.83176: variable 'interface' from source: set_fact 15621 1726882620.83258: variable 'interface' from source: set_fact 15621 1726882620.83318: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' 
skipped due to reserved name 15621 1726882620.83497: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15621 1726882620.83588: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15621 1726882620.83594: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15621 1726882620.83601: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15621 1726882620.83647: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15621 1726882620.83670: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15621 1726882620.83697: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882620.83720: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15621 1726882620.83777: variable '__network_team_connections_defined' from source: role '' defaults 15621 1726882620.84012: variable 'network_connections' from source: play vars 15621 1726882620.84021: variable 'profile' from source: play vars 15621 1726882620.84086: variable 'profile' from source: play vars 15621 1726882620.84090: variable 'interface' from source: set_fact 15621 1726882620.84152: variable 'interface' from source: set_fact 15621 1726882620.84241: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15621 1726882620.84247: when evaluation is False, skipping this task 15621 1726882620.84253: _execute() done 15621 1726882620.84255: dumping result to json 15621 1726882620.84258: done dumping result, returning 15621 1726882620.84260: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affc7ec-ae25-af1a-5b92-000000000061] 15621 1726882620.84262: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000061 15621 1726882620.84335: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000061 15621 1726882620.84338: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15621 1726882620.84413: no more pending results, returning what we have 15621 1726882620.84417: results queue empty 15621 1726882620.84418: checking for any_errors_fatal 15621 1726882620.84428: done checking for any_errors_fatal 15621 1726882620.84429: checking for max_fail_percentage 15621 1726882620.84430: done checking for max_fail_percentage 15621 1726882620.84431: checking to see if all hosts have failed and the running result is not ok 15621 1726882620.84432: done checking to see if all hosts have failed 15621 1726882620.84432: getting the remaining hosts for this loop 15621 1726882620.84434: done getting the remaining hosts for this loop 15621 
1726882620.84438: getting the next task for host managed_node3 15621 1726882620.84443: done getting next task for host managed_node3 15621 1726882620.84447: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 15621 1726882620.84448: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15621 1726882620.84538: getting variables 15621 1726882620.84540: in VariableManager get_vars() 15621 1726882620.84584: Calling all_inventory to load vars for managed_node3 15621 1726882620.84587: Calling groups_inventory to load vars for managed_node3 15621 1726882620.84589: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882620.84599: Calling all_plugins_play to load vars for managed_node3 15621 1726882620.84602: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882620.84605: Calling groups_plugins_play to load vars for managed_node3 15621 1726882620.86329: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882620.88827: done with get_vars() 15621 1726882620.88855: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 15621 1726882620.88949: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:37:00 -0400 (0:00:00.126) 0:00:52.969 ****** 15621 1726882620.88987: entering _queue_task() for managed_node3/yum 15621 1726882620.89381: worker is 1 (out of 1 available) 15621 1726882620.89396: exiting _queue_task() for managed_node3/yum 15621 1726882620.89408: done queuing things up, now waiting for results queue to drain 15621 1726882620.89410: waiting for pending results... 
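
The DNF check above was skipped because neither __network_wireless_connections_defined nor __network_team_connections_defined evaluated true for this profile. As an illustration only (not the role's actual Jinja defaults), the idea behind those two flags can be expressed as a scan of the requested connection profiles; the function name and the sample profile below are hypothetical.

    # Illustrative sketch, assuming the two flags boil down to "is any requested
    # connection of type wireless or team"; the sample profile is hypothetical.
    from typing import Iterable

    def wireless_or_team_defined(network_connections: Iterable[dict]) -> bool:
        wireless = any(c.get("type") == "wireless" for c in network_connections)
        team = any(c.get("type") == "team" for c in network_connections)
        return wireless or team

    sample_profiles = [{"name": "example-profile", "type": "ethernet", "state": "up"}]
    print(wireless_or_team_defined(sample_profiles))  # False -> the check task is skipped
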
15621 1726882620.89847: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 15621 1726882620.89860: in run() - task 0affc7ec-ae25-af1a-5b92-000000000062 15621 1726882620.89887: variable 'ansible_search_path' from source: unknown 15621 1726882620.89895: variable 'ansible_search_path' from source: unknown 15621 1726882620.89945: calling self._execute() 15621 1726882620.90078: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882620.90091: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882620.90104: variable 'omit' from source: magic vars 15621 1726882620.90590: variable 'ansible_distribution_major_version' from source: facts 15621 1726882620.90593: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882620.90747: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15621 1726882620.93660: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15621 1726882620.93780: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15621 1726882620.93847: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15621 1726882620.93898: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15621 1726882620.93932: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15621 1726882620.94033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882620.94229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882620.94233: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882620.94237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882620.94240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882620.94314: variable 'ansible_distribution_major_version' from source: facts 15621 1726882620.94338: Evaluated conditional (ansible_distribution_major_version | int < 8): False 15621 1726882620.94347: when evaluation is False, skipping this task 15621 1726882620.94365: _execute() done 15621 1726882620.94375: dumping result to json 15621 1726882620.94385: done dumping result, returning 15621 1726882620.94399: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affc7ec-ae25-af1a-5b92-000000000062] 15621 
1726882620.94410: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000062 skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 15621 1726882620.94706: no more pending results, returning what we have 15621 1726882620.94710: results queue empty 15621 1726882620.94711: checking for any_errors_fatal 15621 1726882620.94720: done checking for any_errors_fatal 15621 1726882620.94721: checking for max_fail_percentage 15621 1726882620.94729: done checking for max_fail_percentage 15621 1726882620.94730: checking to see if all hosts have failed and the running result is not ok 15621 1726882620.94731: done checking to see if all hosts have failed 15621 1726882620.94732: getting the remaining hosts for this loop 15621 1726882620.94734: done getting the remaining hosts for this loop 15621 1726882620.94739: getting the next task for host managed_node3 15621 1726882620.94746: done getting next task for host managed_node3 15621 1726882620.94751: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 15621 1726882620.94753: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15621 1726882620.94770: getting variables 15621 1726882620.94775: in VariableManager get_vars() 15621 1726882620.94821: Calling all_inventory to load vars for managed_node3 15621 1726882620.94931: Calling groups_inventory to load vars for managed_node3 15621 1726882620.94934: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882620.95040: Calling all_plugins_play to load vars for managed_node3 15621 1726882620.95044: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882620.95048: Calling groups_plugins_play to load vars for managed_node3 15621 1726882620.95656: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000062 15621 1726882620.95660: WORKER PROCESS EXITING 15621 1726882620.96957: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882620.99410: done with get_vars() 15621 1726882620.99452: done getting variables 15621 1726882620.99543: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:37:00 -0400 (0:00:00.105) 0:00:53.075 ****** 15621 1726882620.99592: entering _queue_task() for managed_node3/fail 15621 1726882620.99989: worker is 1 (out of 1 available) 15621 1726882621.00003: exiting _queue_task() for managed_node3/fail 15621 1726882621.00017: done queuing things up, now waiting for results queue to drain 15621 1726882621.00018: waiting for pending results... 
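
The YUM variant of the same check was skipped on its own version gate, ansible_distribution_major_version | int < 8, and the log also shows ansible.builtin.yum being redirected to ansible.builtin.dnf. A small sketch of that gate follows; the version strings passed in are assumed examples, not the host's reported value.

    # Sketch of the version gate: the YUM path only applies below major version 8.
    def yum_path_applies(ansible_distribution_major_version: str) -> bool:
        # Jinja's `int` filter coerces the fact (normally a string) to an integer,
        # defaulting to 0 when it cannot be parsed; this mirrors that behaviour.
        try:
            major = int(ansible_distribution_major_version)
        except (TypeError, ValueError):
            major = 0
        return major < 8

    print(yum_path_applies("40"))  # False -> task skipped ("40" is an assumed example)
    print(yum_path_applies("7"))   # True on an EL7-style host (also an assumed example)
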
15621 1726882621.00353: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 15621 1726882621.00502: in run() - task 0affc7ec-ae25-af1a-5b92-000000000063 15621 1726882621.00528: variable 'ansible_search_path' from source: unknown 15621 1726882621.00537: variable 'ansible_search_path' from source: unknown 15621 1726882621.00591: calling self._execute() 15621 1726882621.00698: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882621.00728: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882621.00732: variable 'omit' from source: magic vars 15621 1726882621.01205: variable 'ansible_distribution_major_version' from source: facts 15621 1726882621.01303: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882621.01397: variable '__network_wireless_connections_defined' from source: role '' defaults 15621 1726882621.01716: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15621 1726882621.04386: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15621 1726882621.04728: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15621 1726882621.04758: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15621 1726882621.04788: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15621 1726882621.04809: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15621 1726882621.04878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882621.04903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882621.04925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882621.04956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882621.04970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882621.05008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882621.05027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882621.05046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882621.05080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882621.05092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882621.05124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882621.05141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882621.05160: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882621.05193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882621.05204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882621.05327: variable 'network_connections' from source: play vars 15621 1726882621.05337: variable 'profile' from source: play vars 15621 1726882621.05399: variable 'profile' from source: play vars 15621 1726882621.05402: variable 'interface' from source: set_fact 15621 1726882621.05446: variable 'interface' from source: set_fact 15621 1726882621.05503: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15621 1726882621.05621: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15621 1726882621.05651: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15621 1726882621.05673: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15621 1726882621.05708: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15621 1726882621.05747: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15621 1726882621.05764: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15621 1726882621.05784: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882621.05803: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15621 1726882621.05847: 
variable '__network_team_connections_defined' from source: role '' defaults 15621 1726882621.06010: variable 'network_connections' from source: play vars 15621 1726882621.06014: variable 'profile' from source: play vars 15621 1726882621.06080: variable 'profile' from source: play vars 15621 1726882621.06083: variable 'interface' from source: set_fact 15621 1726882621.06127: variable 'interface' from source: set_fact 15621 1726882621.06151: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15621 1726882621.06156: when evaluation is False, skipping this task 15621 1726882621.06161: _execute() done 15621 1726882621.06164: dumping result to json 15621 1726882621.06166: done dumping result, returning 15621 1726882621.06294: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affc7ec-ae25-af1a-5b92-000000000063] 15621 1726882621.06306: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000063 15621 1726882621.06378: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000063 15621 1726882621.06381: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15621 1726882621.06443: no more pending results, returning what we have 15621 1726882621.06446: results queue empty 15621 1726882621.06447: checking for any_errors_fatal 15621 1726882621.06452: done checking for any_errors_fatal 15621 1726882621.06452: checking for max_fail_percentage 15621 1726882621.06454: done checking for max_fail_percentage 15621 1726882621.06455: checking to see if all hosts have failed and the running result is not ok 15621 1726882621.06456: done checking to see if all hosts have failed 15621 1726882621.06456: getting the remaining hosts for this loop 15621 1726882621.06458: done getting the remaining hosts for this loop 15621 1726882621.06461: getting the next task for host managed_node3 15621 1726882621.06466: done getting next task for host managed_node3 15621 1726882621.06470: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 15621 1726882621.06472: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882621.06485: getting variables 15621 1726882621.06487: in VariableManager get_vars() 15621 1726882621.06527: Calling all_inventory to load vars for managed_node3 15621 1726882621.06530: Calling groups_inventory to load vars for managed_node3 15621 1726882621.06532: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882621.06541: Calling all_plugins_play to load vars for managed_node3 15621 1726882621.06544: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882621.06546: Calling groups_plugins_play to load vars for managed_node3 15621 1726882621.08199: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882621.09367: done with get_vars() 15621 1726882621.09386: done getting variables 15621 1726882621.09439: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:37:01 -0400 (0:00:00.098) 0:00:53.173 ****** 15621 1726882621.09462: entering _queue_task() for managed_node3/package 15621 1726882621.09764: worker is 1 (out of 1 available) 15621 1726882621.09777: exiting _queue_task() for managed_node3/package 15621 1726882621.09789: done queuing things up, now waiting for results queue to drain 15621 1726882621.09791: waiting for pending results... 15621 1726882621.10250: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 15621 1726882621.10261: in run() - task 0affc7ec-ae25-af1a-5b92-000000000064 15621 1726882621.10266: variable 'ansible_search_path' from source: unknown 15621 1726882621.10270: variable 'ansible_search_path' from source: unknown 15621 1726882621.10279: calling self._execute() 15621 1726882621.10389: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882621.10393: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882621.10397: variable 'omit' from source: magic vars 15621 1726882621.10789: variable 'ansible_distribution_major_version' from source: facts 15621 1726882621.10824: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882621.11011: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15621 1726882621.11294: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15621 1726882621.11366: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15621 1726882621.11379: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15621 1726882621.11454: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15621 1726882621.11583: variable 'network_packages' from source: role '' defaults 15621 1726882621.11691: variable '__network_provider_setup' from source: role '' defaults 15621 1726882621.11758: variable '__network_service_name_default_nm' from source: role '' defaults 15621 1726882621.11779: variable 
'__network_service_name_default_nm' from source: role '' defaults 15621 1726882621.11786: variable '__network_packages_default_nm' from source: role '' defaults 15621 1726882621.11852: variable '__network_packages_default_nm' from source: role '' defaults 15621 1726882621.12056: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15621 1726882621.14262: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15621 1726882621.14428: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15621 1726882621.14432: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15621 1726882621.14435: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15621 1726882621.14438: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15621 1726882621.14527: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882621.14559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882621.14589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882621.14635: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882621.14650: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882621.14703: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882621.14730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882621.14757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882621.14887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882621.14891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882621.15067: variable '__network_packages_default_gobject_packages' from source: role '' defaults 15621 1726882621.15187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882621.15213: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882621.15244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882621.15285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882621.15300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882621.15399: variable 'ansible_python' from source: facts 15621 1726882621.15431: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 15621 1726882621.15517: variable '__network_wpa_supplicant_required' from source: role '' defaults 15621 1726882621.15630: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15621 1726882621.15745: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882621.15768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882621.15800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882621.15868: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882621.15879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882621.15911: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882621.15982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882621.15985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882621.16007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882621.16028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882621.16196: variable 'network_connections' from source: play vars 15621 1726882621.16199: variable 'profile' from source: play vars 15621 1726882621.16289: variable 'profile' from source: play vars 15621 1726882621.16317: variable 'interface' from source: set_fact 15621 1726882621.16377: variable 'interface' from source: set_fact 15621 1726882621.16530: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15621 1726882621.16534: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15621 1726882621.16537: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882621.16540: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15621 1726882621.16589: variable '__network_wireless_connections_defined' from source: role '' defaults 15621 1726882621.16906: variable 'network_connections' from source: play vars 15621 1726882621.16909: variable 'profile' from source: play vars 15621 1726882621.17016: variable 'profile' from source: play vars 15621 1726882621.17024: variable 'interface' from source: set_fact 15621 1726882621.17201: variable 'interface' from source: set_fact 15621 1726882621.17206: variable '__network_packages_default_wireless' from source: role '' defaults 15621 1726882621.17214: variable '__network_wireless_connections_defined' from source: role '' defaults 15621 1726882621.17566: variable 'network_connections' from source: play vars 15621 1726882621.17569: variable 'profile' from source: play vars 15621 1726882621.17643: variable 'profile' from source: play vars 15621 1726882621.17648: variable 'interface' from source: set_fact 15621 1726882621.17764: variable 'interface' from source: set_fact 15621 1726882621.17783: variable '__network_packages_default_team' from source: role '' defaults 15621 1726882621.17877: variable '__network_team_connections_defined' from source: role '' defaults 15621 1726882621.18208: variable 'network_connections' from source: play vars 15621 1726882621.18212: variable 'profile' from source: play vars 15621 1726882621.18282: variable 'profile' from source: play vars 15621 1726882621.18287: variable 'interface' from source: set_fact 15621 1726882621.18395: variable 'interface' from source: set_fact 15621 1726882621.18462: variable '__network_service_name_default_initscripts' from source: role '' defaults 15621 1726882621.18527: variable '__network_service_name_default_initscripts' from source: role '' defaults 15621 1726882621.18539: variable '__network_packages_default_initscripts' from source: role '' defaults 15621 1726882621.18601: variable '__network_packages_default_initscripts' from source: role '' defaults 15621 1726882621.18855: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 15621 1726882621.19397: variable 'network_connections' from source: play vars 15621 1726882621.19401: variable 'profile' from source: play vars 15621 
1726882621.19528: variable 'profile' from source: play vars 15621 1726882621.19532: variable 'interface' from source: set_fact 15621 1726882621.19538: variable 'interface' from source: set_fact 15621 1726882621.19548: variable 'ansible_distribution' from source: facts 15621 1726882621.19551: variable '__network_rh_distros' from source: role '' defaults 15621 1726882621.19557: variable 'ansible_distribution_major_version' from source: facts 15621 1726882621.19572: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 15621 1726882621.19769: variable 'ansible_distribution' from source: facts 15621 1726882621.19775: variable '__network_rh_distros' from source: role '' defaults 15621 1726882621.19779: variable 'ansible_distribution_major_version' from source: facts 15621 1726882621.19781: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 15621 1726882621.19986: variable 'ansible_distribution' from source: facts 15621 1726882621.19991: variable '__network_rh_distros' from source: role '' defaults 15621 1726882621.19993: variable 'ansible_distribution_major_version' from source: facts 15621 1726882621.20031: variable 'network_provider' from source: set_fact 15621 1726882621.20034: variable 'ansible_facts' from source: unknown 15621 1726882621.20955: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 15621 1726882621.20959: when evaluation is False, skipping this task 15621 1726882621.20963: _execute() done 15621 1726882621.20965: dumping result to json 15621 1726882621.20968: done dumping result, returning 15621 1726882621.21027: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0affc7ec-ae25-af1a-5b92-000000000064] 15621 1726882621.21030: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000064 15621 1726882621.21109: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000064 15621 1726882621.21113: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 15621 1726882621.21171: no more pending results, returning what we have 15621 1726882621.21174: results queue empty 15621 1726882621.21176: checking for any_errors_fatal 15621 1726882621.21186: done checking for any_errors_fatal 15621 1726882621.21187: checking for max_fail_percentage 15621 1726882621.21189: done checking for max_fail_percentage 15621 1726882621.21189: checking to see if all hosts have failed and the running result is not ok 15621 1726882621.21191: done checking to see if all hosts have failed 15621 1726882621.21191: getting the remaining hosts for this loop 15621 1726882621.21193: done getting the remaining hosts for this loop 15621 1726882621.21198: getting the next task for host managed_node3 15621 1726882621.21205: done getting next task for host managed_node3 15621 1726882621.21209: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 15621 1726882621.21211: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882621.21230: getting variables 15621 1726882621.21232: in VariableManager get_vars() 15621 1726882621.21275: Calling all_inventory to load vars for managed_node3 15621 1726882621.21278: Calling groups_inventory to load vars for managed_node3 15621 1726882621.21281: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882621.21300: Calling all_plugins_play to load vars for managed_node3 15621 1726882621.21303: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882621.21306: Calling groups_plugins_play to load vars for managed_node3 15621 1726882621.23255: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882621.25442: done with get_vars() 15621 1726882621.25467: done getting variables 15621 1726882621.25535: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:37:01 -0400 (0:00:00.161) 0:00:53.334 ****** 15621 1726882621.25568: entering _queue_task() for managed_node3/package 15621 1726882621.25919: worker is 1 (out of 1 available) 15621 1726882621.25935: exiting _queue_task() for managed_node3/package 15621 1726882621.25947: done queuing things up, now waiting for results queue to drain 15621 1726882621.25949: waiting for pending results... 
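
The "Install packages" task above was skipped because network_packages is already a subset of the keys of the gathered package facts. Ansible's subset test amounts to a containment check; the sketch below uses assumed package names, since the real lists are not printed in this excerpt.

    # Sketch of the `subset` containment test used above; package names are
    # assumed examples standing in for network_packages and ansible_facts.packages.
    network_packages = ["NetworkManager"]              # assumed
    installed = {"NetworkManager": [], "kernel": []}   # stand-in for ansible_facts.packages

    needs_install = not set(network_packages) <= set(installed.keys())
    print(needs_install)  # False -> "Install packages" is skipped, matching the log
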
15621 1726882621.26311: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 15621 1726882621.26407: in run() - task 0affc7ec-ae25-af1a-5b92-000000000065 15621 1726882621.26411: variable 'ansible_search_path' from source: unknown 15621 1726882621.26415: variable 'ansible_search_path' from source: unknown 15621 1726882621.26418: calling self._execute() 15621 1726882621.26502: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882621.26513: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882621.26517: variable 'omit' from source: magic vars 15621 1726882621.27127: variable 'ansible_distribution_major_version' from source: facts 15621 1726882621.27131: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882621.27134: variable 'network_state' from source: role '' defaults 15621 1726882621.27136: Evaluated conditional (network_state != {}): False 15621 1726882621.27139: when evaluation is False, skipping this task 15621 1726882621.27141: _execute() done 15621 1726882621.27144: dumping result to json 15621 1726882621.27146: done dumping result, returning 15621 1726882621.27148: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affc7ec-ae25-af1a-5b92-000000000065] 15621 1726882621.27152: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000065 15621 1726882621.27223: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000065 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15621 1726882621.27277: no more pending results, returning what we have 15621 1726882621.27281: results queue empty 15621 1726882621.27283: checking for any_errors_fatal 15621 1726882621.27289: done checking for any_errors_fatal 15621 1726882621.27290: checking for max_fail_percentage 15621 1726882621.27291: done checking for max_fail_percentage 15621 1726882621.27292: checking to see if all hosts have failed and the running result is not ok 15621 1726882621.27293: done checking to see if all hosts have failed 15621 1726882621.27294: getting the remaining hosts for this loop 15621 1726882621.27295: done getting the remaining hosts for this loop 15621 1726882621.27300: getting the next task for host managed_node3 15621 1726882621.27306: done getting next task for host managed_node3 15621 1726882621.27310: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 15621 1726882621.27313: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882621.27337: getting variables 15621 1726882621.27339: in VariableManager get_vars() 15621 1726882621.27380: Calling all_inventory to load vars for managed_node3 15621 1726882621.27383: Calling groups_inventory to load vars for managed_node3 15621 1726882621.27385: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882621.27402: Calling all_plugins_play to load vars for managed_node3 15621 1726882621.27405: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882621.27410: Calling groups_plugins_play to load vars for managed_node3 15621 1726882621.27954: WORKER PROCESS EXITING 15621 1726882621.29358: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882621.31800: done with get_vars() 15621 1726882621.31841: done getting variables 15621 1726882621.31920: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:37:01 -0400 (0:00:00.063) 0:00:53.398 ****** 15621 1726882621.31958: entering _queue_task() for managed_node3/package 15621 1726882621.32531: worker is 1 (out of 1 available) 15621 1726882621.32543: exiting _queue_task() for managed_node3/package 15621 1726882621.32553: done queuing things up, now waiting for results queue to drain 15621 1726882621.32554: waiting for pending results... 
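
The task above was skipped because network_state, taken from the role defaults, is empty, and the python3-libnmstate task queued next carries the same network_state != {} gate, as its skip result below shows. In other words, this play drives the role through network_connections rather than network_state. A trivial sketch of the gate:

    # Sketch of the `network_state != {}` gate: the nmstate-related install tasks
    # only run when the caller supplies a non-empty network_state mapping.
    network_state: dict = {}      # empty, per the role-default source shown in the log
    print(network_state != {})    # False -> the install tasks are skipped
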
15621 1726882621.32762: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 15621 1726882621.32861: in run() - task 0affc7ec-ae25-af1a-5b92-000000000066 15621 1726882621.32967: variable 'ansible_search_path' from source: unknown 15621 1726882621.32971: variable 'ansible_search_path' from source: unknown 15621 1726882621.32976: calling self._execute() 15621 1726882621.33079: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882621.33095: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882621.33119: variable 'omit' from source: magic vars 15621 1726882621.33590: variable 'ansible_distribution_major_version' from source: facts 15621 1726882621.33609: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882621.33780: variable 'network_state' from source: role '' defaults 15621 1726882621.33828: Evaluated conditional (network_state != {}): False 15621 1726882621.33833: when evaluation is False, skipping this task 15621 1726882621.33841: _execute() done 15621 1726882621.33844: dumping result to json 15621 1726882621.33846: done dumping result, returning 15621 1726882621.33850: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affc7ec-ae25-af1a-5b92-000000000066] 15621 1726882621.33852: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000066 15621 1726882621.34036: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000066 15621 1726882621.34039: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15621 1726882621.34228: no more pending results, returning what we have 15621 1726882621.34232: results queue empty 15621 1726882621.34233: checking for any_errors_fatal 15621 1726882621.34241: done checking for any_errors_fatal 15621 1726882621.34242: checking for max_fail_percentage 15621 1726882621.34243: done checking for max_fail_percentage 15621 1726882621.34244: checking to see if all hosts have failed and the running result is not ok 15621 1726882621.34245: done checking to see if all hosts have failed 15621 1726882621.34246: getting the remaining hosts for this loop 15621 1726882621.34248: done getting the remaining hosts for this loop 15621 1726882621.34252: getting the next task for host managed_node3 15621 1726882621.34258: done getting next task for host managed_node3 15621 1726882621.34262: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 15621 1726882621.34264: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882621.34283: getting variables 15621 1726882621.34285: in VariableManager get_vars() 15621 1726882621.34442: Calling all_inventory to load vars for managed_node3 15621 1726882621.34446: Calling groups_inventory to load vars for managed_node3 15621 1726882621.34448: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882621.34458: Calling all_plugins_play to load vars for managed_node3 15621 1726882621.34461: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882621.34465: Calling groups_plugins_play to load vars for managed_node3 15621 1726882621.36544: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882621.38936: done with get_vars() 15621 1726882621.38965: done getting variables 15621 1726882621.39051: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:37:01 -0400 (0:00:00.071) 0:00:53.470 ****** 15621 1726882621.39087: entering _queue_task() for managed_node3/service 15621 1726882621.39502: worker is 1 (out of 1 available) 15621 1726882621.39515: exiting _queue_task() for managed_node3/service 15621 1726882621.39735: done queuing things up, now waiting for results queue to drain 15621 1726882621.39737: waiting for pending results... 
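Each TASK banner in this log ends with a timing line such as 'Friday 20 September 2024 21:37:01 -0400 (0:00:00.071) 0:00:53.470 ******': the parenthesised figure is the previous task's duration and the second figure is the cumulative elapsed time (the previous banner read 0:00:53.398). A hypothetical helper for pulling those two numbers out of a saved copy of this log; the regex and the to_seconds() function are assumptions, not part of ansible-core:

    import re

    TIMING = re.compile(r"\((?P<last>\d+:\d{2}:\d{2}\.\d+)\)\s+(?P<total>\d+:\d{2}:\d{2}\.\d+)")

    def to_seconds(stamp):
        # "0:00:53.470" -> 53.47
        hours, minutes, seconds = stamp.split(":")
        return int(hours) * 3600 + int(minutes) * 60 + float(seconds)

    line = "Friday 20 September 2024 21:37:01 -0400 (0:00:00.071) 0:00:53.470 ******"
    match = TIMING.search(line)
    print(to_seconds(match.group("last")), to_seconds(match.group("total")))  # 0.071 53.47
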
15621 1726882621.39940: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 15621 1726882621.39992: in run() - task 0affc7ec-ae25-af1a-5b92-000000000067 15621 1726882621.40014: variable 'ansible_search_path' from source: unknown 15621 1726882621.40063: variable 'ansible_search_path' from source: unknown 15621 1726882621.40086: calling self._execute() 15621 1726882621.40205: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882621.40218: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882621.40236: variable 'omit' from source: magic vars 15621 1726882621.40723: variable 'ansible_distribution_major_version' from source: facts 15621 1726882621.40730: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882621.40870: variable '__network_wireless_connections_defined' from source: role '' defaults 15621 1726882621.41132: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15621 1726882621.43979: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15621 1726882621.44031: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15621 1726882621.44075: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15621 1726882621.44135: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15621 1726882621.44193: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15621 1726882621.44282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882621.44337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882621.44411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882621.44441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882621.44464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882621.44531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882621.44567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882621.44603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 15621 1726882621.44666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882621.44727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882621.44751: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882621.44791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882621.44825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882621.44890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882621.44910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882621.45178: variable 'network_connections' from source: play vars 15621 1726882621.45187: variable 'profile' from source: play vars 15621 1726882621.45268: variable 'profile' from source: play vars 15621 1726882621.45310: variable 'interface' from source: set_fact 15621 1726882621.45400: variable 'interface' from source: set_fact 15621 1726882621.45491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15621 1726882621.45757: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15621 1726882621.45840: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15621 1726882621.45848: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15621 1726882621.45929: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15621 1726882621.45932: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15621 1726882621.45951: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15621 1726882621.45979: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882621.46006: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15621 1726882621.46064: variable '__network_team_connections_defined' from source: role '' defaults 15621 
1726882621.46311: variable 'network_connections' from source: play vars 15621 1726882621.46316: variable 'profile' from source: play vars 15621 1726882621.46384: variable 'profile' from source: play vars 15621 1726882621.46388: variable 'interface' from source: set_fact 15621 1726882621.46491: variable 'interface' from source: set_fact 15621 1726882621.46494: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15621 1726882621.46497: when evaluation is False, skipping this task 15621 1726882621.46499: _execute() done 15621 1726882621.46501: dumping result to json 15621 1726882621.46505: done dumping result, returning 15621 1726882621.46507: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affc7ec-ae25-af1a-5b92-000000000067] 15621 1726882621.46518: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000067 15621 1726882621.46598: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000067 15621 1726882621.46601: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15621 1726882621.46688: no more pending results, returning what we have 15621 1726882621.46691: results queue empty 15621 1726882621.46692: checking for any_errors_fatal 15621 1726882621.46699: done checking for any_errors_fatal 15621 1726882621.46700: checking for max_fail_percentage 15621 1726882621.46703: done checking for max_fail_percentage 15621 1726882621.46704: checking to see if all hosts have failed and the running result is not ok 15621 1726882621.46704: done checking to see if all hosts have failed 15621 1726882621.46705: getting the remaining hosts for this loop 15621 1726882621.46707: done getting the remaining hosts for this loop 15621 1726882621.46711: getting the next task for host managed_node3 15621 1726882621.46716: done getting next task for host managed_node3 15621 1726882621.46720: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 15621 1726882621.46853: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882621.46868: getting variables 15621 1726882621.46869: in VariableManager get_vars() 15621 1726882621.46915: Calling all_inventory to load vars for managed_node3 15621 1726882621.46919: Calling groups_inventory to load vars for managed_node3 15621 1726882621.46923: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882621.46934: Calling all_plugins_play to load vars for managed_node3 15621 1726882621.46942: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882621.46946: Calling groups_plugins_play to load vars for managed_node3 15621 1726882621.47940: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882621.49519: done with get_vars() 15621 1726882621.49548: done getting variables 15621 1726882621.49624: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:37:01 -0400 (0:00:00.105) 0:00:53.575 ****** 15621 1726882621.49657: entering _queue_task() for managed_node3/service 15621 1726882621.50040: worker is 1 (out of 1 available) 15621 1726882621.50054: exiting _queue_task() for managed_node3/service 15621 1726882621.50067: done queuing things up, now waiting for results queue to drain 15621 1726882621.50068: waiting for pending results... 15621 1726882621.50423: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 15621 1726882621.50480: in run() - task 0affc7ec-ae25-af1a-5b92-000000000068 15621 1726882621.50485: variable 'ansible_search_path' from source: unknown 15621 1726882621.50488: variable 'ansible_search_path' from source: unknown 15621 1726882621.50552: calling self._execute() 15621 1726882621.50629: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882621.50633: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882621.50637: variable 'omit' from source: magic vars 15621 1726882621.51096: variable 'ansible_distribution_major_version' from source: facts 15621 1726882621.51102: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882621.51224: variable 'network_provider' from source: set_fact 15621 1726882621.51228: variable 'network_state' from source: role '' defaults 15621 1726882621.51231: Evaluated conditional (network_provider == "nm" or network_state != {}): True 15621 1726882621.51235: variable 'omit' from source: magic vars 15621 1726882621.51263: variable 'omit' from source: magic vars 15621 1726882621.51295: variable 'network_service_name' from source: role '' defaults 15621 1726882621.51370: variable 'network_service_name' from source: role '' defaults 15621 1726882621.51479: variable '__network_provider_setup' from source: role '' defaults 15621 1726882621.51483: variable '__network_service_name_default_nm' from source: role '' defaults 15621 1726882621.51566: variable '__network_service_name_default_nm' from source: role '' defaults 15621 1726882621.51570: variable '__network_packages_default_nm' from source: role '' defaults 
15621 1726882621.51642: variable '__network_packages_default_nm' from source: role '' defaults 15621 1726882621.51892: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15621 1726882621.53531: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15621 1726882621.53588: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15621 1726882621.53623: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15621 1726882621.53651: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15621 1726882621.53671: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15621 1726882621.53742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882621.53763: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882621.53784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882621.53814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882621.53828: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882621.53866: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882621.53884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882621.53901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882621.53971: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882621.53977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882621.54249: variable '__network_packages_default_gobject_packages' from source: role '' defaults 15621 1726882621.54428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882621.54431: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882621.54434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882621.54437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882621.54439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882621.54505: variable 'ansible_python' from source: facts 15621 1726882621.54530: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 15621 1726882621.54616: variable '__network_wpa_supplicant_required' from source: role '' defaults 15621 1726882621.54728: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15621 1726882621.54831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882621.54856: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882621.54884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882621.54924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882621.54940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882621.55015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882621.55028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882621.55103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882621.55106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882621.55109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882621.55237: variable 'network_connections' from 
source: play vars 15621 1726882621.55247: variable 'profile' from source: play vars 15621 1726882621.55321: variable 'profile' from source: play vars 15621 1726882621.55335: variable 'interface' from source: set_fact 15621 1726882621.55393: variable 'interface' from source: set_fact 15621 1726882621.55478: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15621 1726882621.55625: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15621 1726882621.55666: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15621 1726882621.55707: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15621 1726882621.55738: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15621 1726882621.55790: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15621 1726882621.55814: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15621 1726882621.55840: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882621.55865: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15621 1726882621.55908: variable '__network_wireless_connections_defined' from source: role '' defaults 15621 1726882621.56104: variable 'network_connections' from source: play vars 15621 1726882621.56108: variable 'profile' from source: play vars 15621 1726882621.56168: variable 'profile' from source: play vars 15621 1726882621.56171: variable 'interface' from source: set_fact 15621 1726882621.56220: variable 'interface' from source: set_fact 15621 1726882621.56249: variable '__network_packages_default_wireless' from source: role '' defaults 15621 1726882621.56307: variable '__network_wireless_connections_defined' from source: role '' defaults 15621 1726882621.56514: variable 'network_connections' from source: play vars 15621 1726882621.56518: variable 'profile' from source: play vars 15621 1726882621.56576: variable 'profile' from source: play vars 15621 1726882621.56579: variable 'interface' from source: set_fact 15621 1726882621.56634: variable 'interface' from source: set_fact 15621 1726882621.56654: variable '__network_packages_default_team' from source: role '' defaults 15621 1726882621.56715: variable '__network_team_connections_defined' from source: role '' defaults 15621 1726882621.56919: variable 'network_connections' from source: play vars 15621 1726882621.56924: variable 'profile' from source: play vars 15621 1726882621.56979: variable 'profile' from source: play vars 15621 1726882621.56982: variable 'interface' from source: set_fact 15621 1726882621.57041: variable 'interface' from source: set_fact 15621 1726882621.57083: variable '__network_service_name_default_initscripts' from source: role '' defaults 15621 1726882621.57132: variable '__network_service_name_default_initscripts' from source: role '' defaults 15621 1726882621.57138: 
variable '__network_packages_default_initscripts' from source: role '' defaults 15621 1726882621.57184: variable '__network_packages_default_initscripts' from source: role '' defaults 15621 1726882621.57335: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 15621 1726882621.57685: variable 'network_connections' from source: play vars 15621 1726882621.57688: variable 'profile' from source: play vars 15621 1726882621.57736: variable 'profile' from source: play vars 15621 1726882621.57740: variable 'interface' from source: set_fact 15621 1726882621.57795: variable 'interface' from source: set_fact 15621 1726882621.57802: variable 'ansible_distribution' from source: facts 15621 1726882621.57805: variable '__network_rh_distros' from source: role '' defaults 15621 1726882621.57811: variable 'ansible_distribution_major_version' from source: facts 15621 1726882621.57826: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 15621 1726882621.57949: variable 'ansible_distribution' from source: facts 15621 1726882621.57952: variable '__network_rh_distros' from source: role '' defaults 15621 1726882621.57957: variable 'ansible_distribution_major_version' from source: facts 15621 1726882621.57963: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 15621 1726882621.58088: variable 'ansible_distribution' from source: facts 15621 1726882621.58091: variable '__network_rh_distros' from source: role '' defaults 15621 1726882621.58094: variable 'ansible_distribution_major_version' from source: facts 15621 1726882621.58123: variable 'network_provider' from source: set_fact 15621 1726882621.58142: variable 'omit' from source: magic vars 15621 1726882621.58167: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882621.58193: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882621.58209: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882621.58224: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882621.58233: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882621.58260: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882621.58263: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882621.58265: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882621.58344: Set connection var ansible_connection to ssh 15621 1726882621.58351: Set connection var ansible_shell_executable to /bin/sh 15621 1726882621.58357: Set connection var ansible_timeout to 10 15621 1726882621.58360: Set connection var ansible_shell_type to sh 15621 1726882621.58365: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882621.58370: Set connection var ansible_pipelining to False 15621 1726882621.58394: variable 'ansible_shell_executable' from source: unknown 15621 1726882621.58397: variable 'ansible_connection' from source: unknown 15621 1726882621.58400: variable 'ansible_module_compression' from source: unknown 15621 1726882621.58404: variable 'ansible_shell_type' from source: unknown 15621 1726882621.58407: variable 'ansible_shell_executable' from 
source: unknown 15621 1726882621.58410: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882621.58417: variable 'ansible_pipelining' from source: unknown 15621 1726882621.58419: variable 'ansible_timeout' from source: unknown 15621 1726882621.58423: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882621.58502: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15621 1726882621.58511: variable 'omit' from source: magic vars 15621 1726882621.58516: starting attempt loop 15621 1726882621.58519: running the handler 15621 1726882621.58583: variable 'ansible_facts' from source: unknown 15621 1726882621.59176: _low_level_execute_command(): starting 15621 1726882621.59180: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15621 1726882621.59782: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882621.59788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882621.59791: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration <<< 15621 1726882621.59793: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882621.59796: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882621.59843: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882621.59846: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882621.59848: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882621.59947: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882621.62039: stdout chunk (state=3): >>>/root <<< 15621 1726882621.62043: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882621.62046: stdout chunk (state=3): >>><<< 15621 1726882621.62048: stderr chunk (state=3): >>><<< 15621 1726882621.62050: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882621.62054: _low_level_execute_command(): starting 15621 1726882621.62056: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882621.620016-17564-259741631254144 `" && echo ansible-tmp-1726882621.620016-17564-259741631254144="` echo /root/.ansible/tmp/ansible-tmp-1726882621.620016-17564-259741631254144 `" ) && sleep 0' 15621 1726882621.62744: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882621.62805: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882621.62815: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882621.62839: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882621.62957: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882621.64971: stdout chunk (state=3): >>>ansible-tmp-1726882621.620016-17564-259741631254144=/root/.ansible/tmp/ansible-tmp-1726882621.620016-17564-259741631254144 <<< 15621 1726882621.65083: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882621.65135: stderr chunk (state=3): >>><<< 15621 1726882621.65139: stdout chunk (state=3): >>><<< 15621 1726882621.65158: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882621.620016-17564-259741631254144=/root/.ansible/tmp/ansible-tmp-1726882621.620016-17564-259741631254144 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882621.65191: variable 'ansible_module_compression' from source: unknown 15621 1726882621.65233: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15621x2kdpbzd/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 15621 1726882621.65292: variable 'ansible_facts' from source: unknown 15621 1726882621.65434: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882621.620016-17564-259741631254144/AnsiballZ_systemd.py 15621 1726882621.65554: Sending initial data 15621 1726882621.65557: Sent initial data (155 bytes) 15621 1726882621.66025: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882621.66064: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found <<< 15621 1726882621.66067: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882621.66070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found <<< 15621 1726882621.66072: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882621.66121: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882621.66128: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882621.66135: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882621.66216: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882621.67855: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension 
"copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15621 1726882621.67934: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15621 1726882621.68021: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmp0d4weltb /root/.ansible/tmp/ansible-tmp-1726882621.620016-17564-259741631254144/AnsiballZ_systemd.py <<< 15621 1726882621.68026: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882621.620016-17564-259741631254144/AnsiballZ_systemd.py" <<< 15621 1726882621.68103: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmp0d4weltb" to remote "/root/.ansible/tmp/ansible-tmp-1726882621.620016-17564-259741631254144/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882621.620016-17564-259741631254144/AnsiballZ_systemd.py" <<< 15621 1726882621.69456: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882621.69535: stderr chunk (state=3): >>><<< 15621 1726882621.69538: stdout chunk (state=3): >>><<< 15621 1726882621.69558: done transferring module to remote 15621 1726882621.69569: _low_level_execute_command(): starting 15621 1726882621.69574: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882621.620016-17564-259741631254144/ /root/.ansible/tmp/ansible-tmp-1726882621.620016-17564-259741631254144/AnsiballZ_systemd.py && sleep 0' 15621 1726882621.70076: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882621.70079: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found <<< 15621 1726882621.70082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882621.70084: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882621.70086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 15621 1726882621.70088: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882621.70148: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882621.70155: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882621.70159: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882621.70240: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882621.72106: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882621.72164: stderr chunk (state=3): >>><<< 15621 1726882621.72169: stdout chunk (state=3): >>><<< 
15621 1726882621.72184: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882621.72189: _low_level_execute_command(): starting 15621 1726882621.72191: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882621.620016-17564-259741631254144/AnsiballZ_systemd.py && sleep 0' 15621 1726882621.72693: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882621.72697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882621.72699: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882621.72706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882621.72754: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882621.72758: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882621.72762: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882621.72854: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882622.05108: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", 
"TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "685", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:25:00 EDT", "ExecMainStartTimestampMonotonic": "45437073", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "685", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3550", "MemoryCurrent": "11804672", "MemoryPeak": "13709312", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3536916480", "CPUUsageNSec": "1889904000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", 
"LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", 
"Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target shutdown.target", "After": "systemd-journald.socket system.slice basic.target dbus-broker.service dbus.socket network-pre.target cloud-init-local.service sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": <<< 15621 1726882622.05120: stdout chunk (state=3): >>>"system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:22 EDT", "StateChangeTimestampMonotonic": "486988773", "InactiveExitTimestamp": "Fri 2024-09-20 21:25:00 EDT", "InactiveExitTimestampMonotonic": "45437210", "ActiveEnterTimestamp": "Fri 2024-09-20 21:25:02 EDT", "ActiveEnterTimestampMonotonic": "47371748", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:25:00 EDT", "ConditionTimestampMonotonic": "45429688", "AssertTimestamp": "Fri 2024-09-20 21:25:00 EDT", "AssertTimestampMonotonic": "45429690", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6a93edddfc3744e5bee117df30fc836d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 15621 1726882622.07253: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 
<<< 15621 1726882622.07454: stderr chunk (state=3): >>><<< 15621 1726882622.07458: stdout chunk (state=3): >>><<< 15621 1726882622.07462: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "685", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:25:00 EDT", "ExecMainStartTimestampMonotonic": "45437073", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "685", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3550", "MemoryCurrent": "11804672", "MemoryPeak": "13709312", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3536916480", "CPUUsageNSec": "1889904000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": 
"infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target shutdown.target", "After": "systemd-journald.socket system.slice basic.target dbus-broker.service dbus.socket network-pre.target cloud-init-local.service sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:22 EDT", "StateChangeTimestampMonotonic": "486988773", "InactiveExitTimestamp": "Fri 2024-09-20 21:25:00 EDT", "InactiveExitTimestampMonotonic": "45437210", "ActiveEnterTimestamp": "Fri 2024-09-20 21:25:02 EDT", "ActiveEnterTimestampMonotonic": "47371748", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:25:00 EDT", "ConditionTimestampMonotonic": "45429688", "AssertTimestamp": "Fri 2024-09-20 21:25:00 EDT", "AssertTimestampMonotonic": "45429690", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6a93edddfc3744e5bee117df30fc836d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 
originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 15621 1726882622.07606: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882621.620016-17564-259741631254144/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15621 1726882622.07636: _low_level_execute_command(): starting 15621 1726882622.07645: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882621.620016-17564-259741631254144/ > /dev/null 2>&1 && sleep 0' 15621 1726882622.08380: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882622.08445: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882622.08517: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882622.08548: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882622.08577: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882622.08698: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882622.10747: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 <<< 15621 1726882622.10765: stderr chunk (state=3): >>><<< 15621 1726882622.10773: stdout chunk (state=3): >>><<< 15621 1726882622.10797: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882622.10846: handler run complete 15621 1726882622.10924: attempt loop complete, returning result 15621 1726882622.10935: _execute() done 15621 1726882622.10948: dumping result to json 15621 1726882622.11029: done dumping result, returning 15621 1726882622.11032: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affc7ec-ae25-af1a-5b92-000000000068] 15621 1726882622.11035: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000068 15621 1726882622.11562: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000068 15621 1726882622.11567: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15621 1726882622.11635: no more pending results, returning what we have 15621 1726882622.11638: results queue empty 15621 1726882622.11639: checking for any_errors_fatal 15621 1726882622.11646: done checking for any_errors_fatal 15621 1726882622.11647: checking for max_fail_percentage 15621 1726882622.11649: done checking for max_fail_percentage 15621 1726882622.11650: checking to see if all hosts have failed and the running result is not ok 15621 1726882622.11651: done checking to see if all hosts have failed 15621 1726882622.11651: getting the remaining hosts for this loop 15621 1726882622.11653: done getting the remaining hosts for this loop 15621 1726882622.11657: getting the next task for host managed_node3 15621 1726882622.11663: done getting next task for host managed_node3 15621 1726882622.11666: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 15621 1726882622.11668: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882622.11680: getting variables 15621 1726882622.11682: in VariableManager get_vars() 15621 1726882622.11830: Calling all_inventory to load vars for managed_node3 15621 1726882622.11833: Calling groups_inventory to load vars for managed_node3 15621 1726882622.11835: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882622.11845: Calling all_plugins_play to load vars for managed_node3 15621 1726882622.11848: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882622.11855: Calling groups_plugins_play to load vars for managed_node3 15621 1726882622.13667: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882622.16052: done with get_vars() 15621 1726882622.16097: done getting variables 15621 1726882622.16166: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:37:02 -0400 (0:00:00.665) 0:00:54.241 ****** 15621 1726882622.16206: entering _queue_task() for managed_node3/service 15621 1726882622.16609: worker is 1 (out of 1 available) 15621 1726882622.16731: exiting _queue_task() for managed_node3/service 15621 1726882622.16743: done queuing things up, now waiting for results queue to drain 15621 1726882622.16745: waiting for pending results... 15621 1726882622.17110: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 15621 1726882622.17116: in run() - task 0affc7ec-ae25-af1a-5b92-000000000069 15621 1726882622.17120: variable 'ansible_search_path' from source: unknown 15621 1726882622.17125: variable 'ansible_search_path' from source: unknown 15621 1726882622.17212: calling self._execute() 15621 1726882622.17301: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882622.17306: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882622.17317: variable 'omit' from source: magic vars 15621 1726882622.17758: variable 'ansible_distribution_major_version' from source: facts 15621 1726882622.17771: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882622.17911: variable 'network_provider' from source: set_fact 15621 1726882622.17967: Evaluated conditional (network_provider == "nm"): True 15621 1726882622.18025: variable '__network_wpa_supplicant_required' from source: role '' defaults 15621 1726882622.18128: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15621 1726882622.18328: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15621 1726882622.22616: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15621 1726882622.22695: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15621 1726882622.22757: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15621 1726882622.22782: Loading FilterModule 
'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15621 1726882622.22810: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15621 1726882622.22913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882622.22946: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882622.22981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882622.23081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882622.23085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882622.23096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882622.23120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882622.23145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882622.23194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882622.23207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882622.23257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882622.23297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882622.23326: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882622.23365: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882622.23385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 15621 1726882622.23574: variable 'network_connections' from source: play vars 15621 1726882622.23591: variable 'profile' from source: play vars 15621 1726882622.23691: variable 'profile' from source: play vars 15621 1726882622.23695: variable 'interface' from source: set_fact 15621 1726882622.23845: variable 'interface' from source: set_fact 15621 1726882622.23889: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15621 1726882622.24358: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15621 1726882622.24362: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15621 1726882622.24365: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15621 1726882622.24367: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15621 1726882622.24535: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15621 1726882622.24558: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15621 1726882622.24594: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882622.24671: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15621 1726882622.24821: variable '__network_wireless_connections_defined' from source: role '' defaults 15621 1726882622.25443: variable 'network_connections' from source: play vars 15621 1726882622.25518: variable 'profile' from source: play vars 15621 1726882622.25602: variable 'profile' from source: play vars 15621 1726882622.25606: variable 'interface' from source: set_fact 15621 1726882622.25804: variable 'interface' from source: set_fact 15621 1726882622.25839: Evaluated conditional (__network_wpa_supplicant_required): False 15621 1726882622.25866: when evaluation is False, skipping this task 15621 1726882622.25901: _execute() done 15621 1726882622.25969: dumping result to json 15621 1726882622.25973: done dumping result, returning 15621 1726882622.25975: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affc7ec-ae25-af1a-5b92-000000000069] 15621 1726882622.25978: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000069 15621 1726882622.26177: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000069 15621 1726882622.26182: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 15621 1726882622.26241: no more pending results, returning what we have 15621 1726882622.26245: results queue empty 15621 1726882622.26246: checking for any_errors_fatal 15621 1726882622.26268: done checking for any_errors_fatal 15621 1726882622.26269: checking for max_fail_percentage 15621 1726882622.26271: done checking for max_fail_percentage 15621 
1726882622.26271: checking to see if all hosts have failed and the running result is not ok 15621 1726882622.26275: done checking to see if all hosts have failed 15621 1726882622.26276: getting the remaining hosts for this loop 15621 1726882622.26278: done getting the remaining hosts for this loop 15621 1726882622.26282: getting the next task for host managed_node3 15621 1726882622.26289: done getting next task for host managed_node3 15621 1726882622.26293: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 15621 1726882622.26295: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15621 1726882622.26311: getting variables 15621 1726882622.26313: in VariableManager get_vars() 15621 1726882622.26358: Calling all_inventory to load vars for managed_node3 15621 1726882622.26361: Calling groups_inventory to load vars for managed_node3 15621 1726882622.26364: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882622.26380: Calling all_plugins_play to load vars for managed_node3 15621 1726882622.26383: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882622.26387: Calling groups_plugins_play to load vars for managed_node3 15621 1726882622.29160: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882622.36684: done with get_vars() 15621 1726882622.36843: done getting variables 15621 1726882622.36917: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:37:02 -0400 (0:00:00.208) 0:00:54.449 ****** 15621 1726882622.37038: entering _queue_task() for managed_node3/service 15621 1726882622.37780: worker is 1 (out of 1 available) 15621 1726882622.37795: exiting _queue_task() for managed_node3/service 15621 1726882622.37808: done queuing things up, now waiting for results queue to drain 15621 1726882622.37810: waiting for pending results... 
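The records above cover two of the role's service-management steps: "Enable and start NetworkManager" completes unchanged (the systemd module_args echoed in its result are name=NetworkManager, state=started, enabled=true, scope=system), while "Enable and start wpa_supplicant" (roles/network/tasks/main.yml:133) is skipped because __network_wpa_supplicant_required evaluated to False after the role consulted __network_ieee802_1x_connections_defined and __network_wireless_connections_defined. A minimal sketch of tasks that would produce this behaviour, reconstructed only from the conditions and module arguments visible in this log (the actual role source may differ; the wpa_supplicant unit name is an assumption, since the skipped task logs no arguments):

- name: Enable and start NetworkManager
  ansible.builtin.service:        # dispatched to ansible.legacy.systemd in this run
    name: NetworkManager
    state: started
    enabled: true                 # matches the module_args echoed in the result above

- name: Enable and start wpa_supplicant
  ansible.builtin.service:
    name: wpa_supplicant          # assumed unit name (task was skipped, no args logged)
    state: started
    enabled: true
  when:
    - network_provider == "nm"
    - __network_wpa_supplicant_required | bool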
15621 1726882622.38598: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 15621 1726882622.38786: in run() - task 0affc7ec-ae25-af1a-5b92-00000000006a 15621 1726882622.38791: variable 'ansible_search_path' from source: unknown 15621 1726882622.38794: variable 'ansible_search_path' from source: unknown 15621 1726882622.38797: calling self._execute() 15621 1726882622.38895: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882622.38911: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882622.38928: variable 'omit' from source: magic vars 15621 1726882622.39365: variable 'ansible_distribution_major_version' from source: facts 15621 1726882622.39388: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882622.39532: variable 'network_provider' from source: set_fact 15621 1726882622.39550: Evaluated conditional (network_provider == "initscripts"): False 15621 1726882622.39558: when evaluation is False, skipping this task 15621 1726882622.39569: _execute() done 15621 1726882622.39579: dumping result to json 15621 1726882622.39786: done dumping result, returning 15621 1726882622.39790: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0affc7ec-ae25-af1a-5b92-00000000006a] 15621 1726882622.39792: sending task result for task 0affc7ec-ae25-af1a-5b92-00000000006a 15621 1726882622.39864: done sending task result for task 0affc7ec-ae25-af1a-5b92-00000000006a 15621 1726882622.39868: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15621 1726882622.40065: no more pending results, returning what we have 15621 1726882622.40069: results queue empty 15621 1726882622.40070: checking for any_errors_fatal 15621 1726882622.40078: done checking for any_errors_fatal 15621 1726882622.40079: checking for max_fail_percentage 15621 1726882622.40081: done checking for max_fail_percentage 15621 1726882622.40082: checking to see if all hosts have failed and the running result is not ok 15621 1726882622.40083: done checking to see if all hosts have failed 15621 1726882622.40084: getting the remaining hosts for this loop 15621 1726882622.40086: done getting the remaining hosts for this loop 15621 1726882622.40090: getting the next task for host managed_node3 15621 1726882622.40095: done getting next task for host managed_node3 15621 1726882622.40100: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 15621 1726882622.40102: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882622.40118: getting variables 15621 1726882622.40119: in VariableManager get_vars() 15621 1726882622.40163: Calling all_inventory to load vars for managed_node3 15621 1726882622.40166: Calling groups_inventory to load vars for managed_node3 15621 1726882622.40169: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882622.40180: Calling all_plugins_play to load vars for managed_node3 15621 1726882622.40183: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882622.40187: Calling groups_plugins_play to load vars for managed_node3 15621 1726882622.44531: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882622.48783: done with get_vars() 15621 1726882622.48827: done getting variables 15621 1726882622.48896: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:37:02 -0400 (0:00:00.118) 0:00:54.568 ****** 15621 1726882622.48940: entering _queue_task() for managed_node3/copy 15621 1726882622.49342: worker is 1 (out of 1 available) 15621 1726882622.49356: exiting _queue_task() for managed_node3/copy 15621 1726882622.49482: done queuing things up, now waiting for results queue to drain 15621 1726882622.49484: waiting for pending results... 15621 1726882622.49770: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 15621 1726882622.50032: in run() - task 0affc7ec-ae25-af1a-5b92-00000000006b 15621 1726882622.50036: variable 'ansible_search_path' from source: unknown 15621 1726882622.50039: variable 'ansible_search_path' from source: unknown 15621 1726882622.50042: calling self._execute() 15621 1726882622.50163: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882622.50178: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882622.50203: variable 'omit' from source: magic vars 15621 1726882622.50671: variable 'ansible_distribution_major_version' from source: facts 15621 1726882622.50696: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882622.50842: variable 'network_provider' from source: set_fact 15621 1726882622.50901: Evaluated conditional (network_provider == "initscripts"): False 15621 1726882622.50904: when evaluation is False, skipping this task 15621 1726882622.50906: _execute() done 15621 1726882622.50908: dumping result to json 15621 1726882622.50911: done dumping result, returning 15621 1726882622.50914: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affc7ec-ae25-af1a-5b92-00000000006b] 15621 1726882622.50916: sending task result for task 0affc7ec-ae25-af1a-5b92-00000000006b 15621 1726882622.51134: done sending task result for task 0affc7ec-ae25-af1a-5b92-00000000006b 15621 1726882622.51138: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == 
\"initscripts\"", "skip_reason": "Conditional result was False" } 15621 1726882622.51191: no more pending results, returning what we have 15621 1726882622.51195: results queue empty 15621 1726882622.51197: checking for any_errors_fatal 15621 1726882622.51202: done checking for any_errors_fatal 15621 1726882622.51203: checking for max_fail_percentage 15621 1726882622.51205: done checking for max_fail_percentage 15621 1726882622.51205: checking to see if all hosts have failed and the running result is not ok 15621 1726882622.51206: done checking to see if all hosts have failed 15621 1726882622.51207: getting the remaining hosts for this loop 15621 1726882622.51209: done getting the remaining hosts for this loop 15621 1726882622.51213: getting the next task for host managed_node3 15621 1726882622.51219: done getting next task for host managed_node3 15621 1726882622.51329: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 15621 1726882622.51332: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15621 1726882622.51352: getting variables 15621 1726882622.51354: in VariableManager get_vars() 15621 1726882622.51395: Calling all_inventory to load vars for managed_node3 15621 1726882622.51398: Calling groups_inventory to load vars for managed_node3 15621 1726882622.51400: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882622.51413: Calling all_plugins_play to load vars for managed_node3 15621 1726882622.51416: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882622.51419: Calling groups_plugins_play to load vars for managed_node3 15621 1726882622.54186: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882622.57082: done with get_vars() 15621 1726882622.57245: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:37:02 -0400 (0:00:00.085) 0:00:54.653 ****** 15621 1726882622.57447: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 15621 1726882622.58263: worker is 1 (out of 1 available) 15621 1726882622.58276: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 15621 1726882622.58288: done queuing things up, now waiting for results queue to drain 15621 1726882622.58290: waiting for pending results... 
15621 1726882622.58788: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 15621 1726882622.58987: in run() - task 0affc7ec-ae25-af1a-5b92-00000000006c 15621 1726882622.58992: variable 'ansible_search_path' from source: unknown 15621 1726882622.58995: variable 'ansible_search_path' from source: unknown 15621 1726882622.58998: calling self._execute() 15621 1726882622.59489: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882622.59494: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882622.59505: variable 'omit' from source: magic vars 15621 1726882622.60502: variable 'ansible_distribution_major_version' from source: facts 15621 1726882622.60512: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882622.60516: variable 'omit' from source: magic vars 15621 1726882622.60829: variable 'omit' from source: magic vars 15621 1726882622.61346: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15621 1726882622.67397: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15621 1726882622.67931: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15621 1726882622.67938: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15621 1726882622.68334: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15621 1726882622.68338: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15621 1726882622.68342: variable 'network_provider' from source: set_fact 15621 1726882622.68628: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15621 1726882622.68658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15621 1726882622.68688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15621 1726882622.68844: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15621 1726882622.68875: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15621 1726882622.68967: variable 'omit' from source: magic vars 15621 1726882622.69323: variable 'omit' from source: magic vars 15621 1726882622.69557: variable 'network_connections' from source: play vars 15621 1726882622.69570: variable 'profile' from source: play vars 15621 1726882622.69768: variable 'profile' from source: play vars 15621 1726882622.69772: variable 'interface' from source: set_fact 15621 1726882622.70003: variable 'interface' from source: set_fact 15621 1726882622.70131: variable 'omit' from source: magic vars 15621 1726882622.70271: 
variable '__lsr_ansible_managed' from source: task vars 15621 1726882622.70493: variable '__lsr_ansible_managed' from source: task vars 15621 1726882622.71141: Loaded config def from plugin (lookup/template) 15621 1726882622.71145: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 15621 1726882622.71429: File lookup term: get_ansible_managed.j2 15621 1726882622.71433: variable 'ansible_search_path' from source: unknown 15621 1726882622.71436: evaluation_path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 15621 1726882622.71439: search_path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 15621 1726882622.71441: variable 'ansible_search_path' from source: unknown 15621 1726882622.86153: variable 'ansible_managed' from source: unknown 15621 1726882622.86334: variable 'omit' from source: magic vars 15621 1726882622.86371: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882622.86404: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882622.86433: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882622.86456: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882622.86470: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882622.86503: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882622.86511: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882622.86518: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882622.86625: Set connection var ansible_connection to ssh 15621 1726882622.86651: Set connection var ansible_shell_executable to /bin/sh 15621 1726882622.86662: Set connection var ansible_timeout to 10 15621 1726882622.86669: Set connection var ansible_shell_type to sh 15621 1726882622.86679: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882622.86689: Set connection var ansible_pipelining to False 15621 1726882622.86720: variable 'ansible_shell_executable' from source: unknown 15621 1726882622.86732: variable 'ansible_connection' from source: unknown 15621 1726882622.86739: 
variable 'ansible_module_compression' from source: unknown 15621 1726882622.86752: variable 'ansible_shell_type' from source: unknown 15621 1726882622.86759: variable 'ansible_shell_executable' from source: unknown 15621 1726882622.86765: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882622.86772: variable 'ansible_pipelining' from source: unknown 15621 1726882622.86779: variable 'ansible_timeout' from source: unknown 15621 1726882622.86786: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882622.86984: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15621 1726882622.87009: variable 'omit' from source: magic vars 15621 1726882622.87080: starting attempt loop 15621 1726882622.87083: running the handler 15621 1726882622.87086: _low_level_execute_command(): starting 15621 1726882622.87088: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15621 1726882622.87963: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882622.87986: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882622.88083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882622.88126: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882622.88145: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882622.88260: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882622.88616: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882622.90296: stdout chunk (state=3): >>>/root <<< 15621 1726882622.90492: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882622.90495: stdout chunk (state=3): >>><<< 15621 1726882622.90498: stderr chunk (state=3): >>><<< 15621 1726882622.90524: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882622.90544: _low_level_execute_command(): starting 15621 1726882622.90554: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882622.905316-17610-50774448401096 `" && echo ansible-tmp-1726882622.905316-17610-50774448401096="` echo /root/.ansible/tmp/ansible-tmp-1726882622.905316-17610-50774448401096 `" ) && sleep 0' 15621 1726882622.91602: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882622.91669: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882622.91697: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882622.91714: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882622.91840: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882622.93898: stdout chunk (state=3): >>>ansible-tmp-1726882622.905316-17610-50774448401096=/root/.ansible/tmp/ansible-tmp-1726882622.905316-17610-50774448401096 <<< 15621 1726882622.93997: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882622.94110: stderr chunk (state=3): >>><<< 15621 1726882622.94145: stdout chunk (state=3): >>><<< 15621 1726882622.94326: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882622.905316-17610-50774448401096=/root/.ansible/tmp/ansible-tmp-1726882622.905316-17610-50774448401096 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882622.94540: variable 'ansible_module_compression' from source: unknown 15621 1726882622.94664: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15621x2kdpbzd/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 15621 1726882622.94737: variable 'ansible_facts' from source: unknown 15621 1726882622.95081: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882622.905316-17610-50774448401096/AnsiballZ_network_connections.py 15621 1726882622.95669: Sending initial data 15621 1726882622.95787: Sent initial data (166 bytes) 15621 1726882622.96531: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882622.96536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882622.96545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882622.96548: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882622.96563: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882622.96685: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882622.98354: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension 
"users-groups-by-id@openssh.com" revision 1 <<< 15621 1726882622.98439: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15621 1726882622.98520: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmp72ei_cfj /root/.ansible/tmp/ansible-tmp-1726882622.905316-17610-50774448401096/AnsiballZ_network_connections.py <<< 15621 1726882622.98526: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882622.905316-17610-50774448401096/AnsiballZ_network_connections.py" <<< 15621 1726882622.98600: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmp72ei_cfj" to remote "/root/.ansible/tmp/ansible-tmp-1726882622.905316-17610-50774448401096/AnsiballZ_network_connections.py" <<< 15621 1726882622.98608: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882622.905316-17610-50774448401096/AnsiballZ_network_connections.py" <<< 15621 1726882623.00041: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882623.00128: stderr chunk (state=3): >>><<< 15621 1726882623.00132: stdout chunk (state=3): >>><<< 15621 1726882623.00253: done transferring module to remote 15621 1726882623.00256: _low_level_execute_command(): starting 15621 1726882623.00259: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882622.905316-17610-50774448401096/ /root/.ansible/tmp/ansible-tmp-1726882622.905316-17610-50774448401096/AnsiballZ_network_connections.py && sleep 0' 15621 1726882623.00807: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882623.00819: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882623.00835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882623.00850: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882623.00873: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 15621 1726882623.00884: stderr chunk (state=3): >>>debug2: match not found <<< 15621 1726882623.00982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882623.01001: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882623.01118: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882623.03096: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882623.03123: stderr chunk (state=3): >>><<< 15621 1726882623.03127: stdout chunk (state=3): >>><<< 15621 1726882623.03149: _low_level_execute_command() done: 
rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882623.03152: _low_level_execute_command(): starting 15621 1726882623.03155: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882622.905316-17610-50774448401096/AnsiballZ_network_connections.py && sleep 0' 15621 1726882623.03733: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882623.03740: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882623.03754: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882623.03849: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882623.03894: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882623.03990: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882623.32943: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload__4b7vjt2/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload__4b7vjt2/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", 
line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on lsr27/8aca4925-38c7-45a9-b2be-84b83d56f24f: error=unknown <<< 15621 1726882623.33099: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 15621 1726882623.35238: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882623.35257: stderr chunk (state=3): >>>Shared connection to 10.31.45.226 closed. <<< 15621 1726882623.35304: stderr chunk (state=3): >>><<< 15621 1726882623.35318: stdout chunk (state=3): >>><<< 15621 1726882623.35352: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload__4b7vjt2/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload__4b7vjt2/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on lsr27/8aca4925-38c7-45a9-b2be-84b83d56f24f: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 15621 1726882623.35398: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'lsr27', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882622.905316-17610-50774448401096/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15621 1726882623.35448: _low_level_execute_command(): starting 15621 1726882623.35451: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882622.905316-17610-50774448401096/ > /dev/null 2>&1 && sleep 0' 15621 1726882623.36140: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882623.36154: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882623.36249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882623.36283: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882623.36303: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882623.36335: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882623.36471: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882623.38620: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882623.38678: stderr chunk (state=3): >>><<< 15621 1726882623.38681: stdout chunk (state=3): >>><<< 15621 1726882623.38684: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 
10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882623.38686: handler run complete 15621 1726882623.38761: attempt loop complete, returning result 15621 1726882623.38769: _execute() done 15621 1726882623.38791: dumping result to json 15621 1726882623.38928: done dumping result, returning 15621 1726882623.38932: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affc7ec-ae25-af1a-5b92-00000000006c] 15621 1726882623.38939: sending task result for task 0affc7ec-ae25-af1a-5b92-00000000006c changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "lsr27", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 15621 1726882623.39255: no more pending results, returning what we have 15621 1726882623.39261: results queue empty 15621 1726882623.39262: checking for any_errors_fatal 15621 1726882623.39271: done checking for any_errors_fatal 15621 1726882623.39272: checking for max_fail_percentage 15621 1726882623.39276: done checking for max_fail_percentage 15621 1726882623.39277: checking to see if all hosts have failed and the running result is not ok 15621 1726882623.39278: done checking to see if all hosts have failed 15621 1726882623.39279: getting the remaining hosts for this loop 15621 1726882623.39281: done getting the remaining hosts for this loop 15621 1726882623.39286: getting the next task for host managed_node3 15621 1726882623.39293: done getting next task for host managed_node3 15621 1726882623.39297: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 15621 1726882623.39299: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882623.39309: getting variables 15621 1726882623.39311: in VariableManager get_vars() 15621 1726882623.39467: Calling all_inventory to load vars for managed_node3 15621 1726882623.39470: Calling groups_inventory to load vars for managed_node3 15621 1726882623.39472: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882623.39534: Calling all_plugins_play to load vars for managed_node3 15621 1726882623.39538: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882623.39553: Calling groups_plugins_play to load vars for managed_node3 15621 1726882623.39564: done sending task result for task 0affc7ec-ae25-af1a-5b92-00000000006c 15621 1726882623.39566: WORKER PROCESS EXITING 15621 1726882623.42148: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882623.44862: done with get_vars() 15621 1726882623.44907: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:37:03 -0400 (0:00:00.875) 0:00:55.529 ****** 15621 1726882623.45011: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 15621 1726882623.45569: worker is 1 (out of 1 available) 15621 1726882623.45585: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 15621 1726882623.45595: done queuing things up, now waiting for results queue to drain 15621 1726882623.45597: waiting for pending results... 15621 1726882623.45832: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 15621 1726882623.46130: in run() - task 0affc7ec-ae25-af1a-5b92-00000000006d 15621 1726882623.46133: variable 'ansible_search_path' from source: unknown 15621 1726882623.46136: variable 'ansible_search_path' from source: unknown 15621 1726882623.46139: calling self._execute() 15621 1726882623.46141: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882623.46144: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882623.46158: variable 'omit' from source: magic vars 15621 1726882623.46604: variable 'ansible_distribution_major_version' from source: facts 15621 1726882623.46631: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882623.46779: variable 'network_state' from source: role '' defaults 15621 1726882623.46798: Evaluated conditional (network_state != {}): False 15621 1726882623.46805: when evaluation is False, skipping this task 15621 1726882623.46813: _execute() done 15621 1726882623.46819: dumping result to json 15621 1726882623.46830: done dumping result, returning 15621 1726882623.46848: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0affc7ec-ae25-af1a-5b92-00000000006d] 15621 1726882623.46860: sending task result for task 0affc7ec-ae25-af1a-5b92-00000000006d skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15621 1726882623.47119: no more pending results, returning what we have 15621 1726882623.47126: results queue empty 15621 1726882623.47127: checking for any_errors_fatal 15621 1726882623.47139: done checking for any_errors_fatal 15621 1726882623.47140: checking for max_fail_percentage 15621 
1726882623.47142: done checking for max_fail_percentage 15621 1726882623.47142: checking to see if all hosts have failed and the running result is not ok 15621 1726882623.47144: done checking to see if all hosts have failed 15621 1726882623.47145: getting the remaining hosts for this loop 15621 1726882623.47146: done getting the remaining hosts for this loop 15621 1726882623.47151: getting the next task for host managed_node3 15621 1726882623.47162: done getting next task for host managed_node3 15621 1726882623.47167: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 15621 1726882623.47170: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15621 1726882623.47288: getting variables 15621 1726882623.47290: in VariableManager get_vars() 15621 1726882623.47357: Calling all_inventory to load vars for managed_node3 15621 1726882623.47360: Calling groups_inventory to load vars for managed_node3 15621 1726882623.47362: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882623.47387: done sending task result for task 0affc7ec-ae25-af1a-5b92-00000000006d 15621 1726882623.47390: WORKER PROCESS EXITING 15621 1726882623.47403: Calling all_plugins_play to load vars for managed_node3 15621 1726882623.47406: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882623.47409: Calling groups_plugins_play to load vars for managed_node3 15621 1726882623.49807: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882623.52395: done with get_vars() 15621 1726882623.52438: done getting variables 15621 1726882623.52512: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:37:03 -0400 (0:00:00.075) 0:00:55.604 ****** 15621 1726882623.52551: entering _queue_task() for managed_node3/debug 15621 1726882623.53539: worker is 1 (out of 1 available) 15621 1726882623.53551: exiting _queue_task() for managed_node3/debug 15621 1726882623.53562: done queuing things up, now waiting for results queue to drain 15621 1726882623.53564: waiting for pending results... 
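[editor's note] The "Configure networking state" task above is skipped because its "when:" guard, network_state != {}, renders to False: network_state is still the empty role default ("from source: role '' defaults"). The sketch below shows, with plain Jinja2, how such an expression can be evaluated against task variables. It is a simplified illustration only, not ansible-core's conditional plugin, and the value used for ansible_distribution_major_version is a placeholder, since the log only reveals that it is not '6'.

    # Simplified illustration of evaluating a "when:" expression with Jinja2.
    # Not ansible-core's implementation; variable values mirror this run.
    from jinja2 import Environment

    def evaluate_when(expression, task_vars):
        # Wrap the bare expression in an {% if %} block, render it against the
        # task variables, and read back a True/False marker.
        env = Environment()
        tmpl = env.from_string("{%% if %s %%}True{%% else %%}False{%% endif %%}" % expression)
        return tmpl.render(**task_vars) == "True"

    task_vars = {
        "ansible_distribution_major_version": "10",  # placeholder; the log only shows it is not '6'
        "network_state": {},                         # empty role default
    }
    print(evaluate_when("ansible_distribution_major_version != '6'", task_vars))  # True
    print(evaluate_when("network_state != {}", task_vars))  # False -> "skipping this task"
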
15621 1726882623.53953: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 15621 1726882623.54189: in run() - task 0affc7ec-ae25-af1a-5b92-00000000006e 15621 1726882623.54207: variable 'ansible_search_path' from source: unknown 15621 1726882623.54211: variable 'ansible_search_path' from source: unknown 15621 1726882623.54366: calling self._execute() 15621 1726882623.54371: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882623.54378: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882623.54381: variable 'omit' from source: magic vars 15621 1726882623.55382: variable 'ansible_distribution_major_version' from source: facts 15621 1726882623.55508: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882623.55514: variable 'omit' from source: magic vars 15621 1726882623.55563: variable 'omit' from source: magic vars 15621 1726882623.55627: variable 'omit' from source: magic vars 15621 1726882623.55769: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882623.55809: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882623.56008: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882623.56011: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882623.56014: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882623.56016: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882623.56019: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882623.56021: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882623.56298: Set connection var ansible_connection to ssh 15621 1726882623.56307: Set connection var ansible_shell_executable to /bin/sh 15621 1726882623.56313: Set connection var ansible_timeout to 10 15621 1726882623.56316: Set connection var ansible_shell_type to sh 15621 1726882623.56324: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882623.56330: Set connection var ansible_pipelining to False 15621 1726882623.56359: variable 'ansible_shell_executable' from source: unknown 15621 1726882623.56363: variable 'ansible_connection' from source: unknown 15621 1726882623.56366: variable 'ansible_module_compression' from source: unknown 15621 1726882623.56556: variable 'ansible_shell_type' from source: unknown 15621 1726882623.56561: variable 'ansible_shell_executable' from source: unknown 15621 1726882623.56563: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882623.56565: variable 'ansible_pipelining' from source: unknown 15621 1726882623.56568: variable 'ansible_timeout' from source: unknown 15621 1726882623.56570: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882623.56775: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15621 
1726882623.56827: variable 'omit' from source: magic vars 15621 1726882623.56831: starting attempt loop 15621 1726882623.56834: running the handler 15621 1726882623.57165: variable '__network_connections_result' from source: set_fact 15621 1726882623.57220: handler run complete 15621 1726882623.57337: attempt loop complete, returning result 15621 1726882623.57341: _execute() done 15621 1726882623.57348: dumping result to json 15621 1726882623.57354: done dumping result, returning 15621 1726882623.57365: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affc7ec-ae25-af1a-5b92-00000000006e] 15621 1726882623.57371: sending task result for task 0affc7ec-ae25-af1a-5b92-00000000006e 15621 1726882623.57484: done sending task result for task 0affc7ec-ae25-af1a-5b92-00000000006e 15621 1726882623.57489: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "" ] } 15621 1726882623.57562: no more pending results, returning what we have 15621 1726882623.57566: results queue empty 15621 1726882623.57567: checking for any_errors_fatal 15621 1726882623.57576: done checking for any_errors_fatal 15621 1726882623.57577: checking for max_fail_percentage 15621 1726882623.57578: done checking for max_fail_percentage 15621 1726882623.57579: checking to see if all hosts have failed and the running result is not ok 15621 1726882623.57581: done checking to see if all hosts have failed 15621 1726882623.57581: getting the remaining hosts for this loop 15621 1726882623.57583: done getting the remaining hosts for this loop 15621 1726882623.57588: getting the next task for host managed_node3 15621 1726882623.57594: done getting next task for host managed_node3 15621 1726882623.57598: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 15621 1726882623.57601: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882623.57613: getting variables 15621 1726882623.57615: in VariableManager get_vars() 15621 1726882623.57659: Calling all_inventory to load vars for managed_node3 15621 1726882623.57662: Calling groups_inventory to load vars for managed_node3 15621 1726882623.57665: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882623.57678: Calling all_plugins_play to load vars for managed_node3 15621 1726882623.57681: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882623.57685: Calling groups_plugins_play to load vars for managed_node3 15621 1726882623.61403: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882623.65732: done with get_vars() 15621 1726882623.65773: done getting variables 15621 1726882623.66052: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:37:03 -0400 (0:00:00.135) 0:00:55.740 ****** 15621 1726882623.66088: entering _queue_task() for managed_node3/debug 15621 1726882623.66682: worker is 1 (out of 1 available) 15621 1726882623.66696: exiting _queue_task() for managed_node3/debug 15621 1726882623.66710: done queuing things up, now waiting for results queue to drain 15621 1726882623.66712: waiting for pending results... 15621 1726882623.67454: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 15621 1726882623.67501: in run() - task 0affc7ec-ae25-af1a-5b92-00000000006f 15621 1726882623.67629: variable 'ansible_search_path' from source: unknown 15621 1726882623.67633: variable 'ansible_search_path' from source: unknown 15621 1726882623.67679: calling self._execute() 15621 1726882623.67907: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882623.67923: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882623.68094: variable 'omit' from source: magic vars 15621 1726882623.68855: variable 'ansible_distribution_major_version' from source: facts 15621 1726882623.69028: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882623.69032: variable 'omit' from source: magic vars 15621 1726882623.69048: variable 'omit' from source: magic vars 15621 1726882623.69102: variable 'omit' from source: magic vars 15621 1726882623.69232: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882623.69284: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882623.69347: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882623.69371: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882623.69390: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882623.69438: variable 
'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882623.69447: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882623.69456: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882623.69592: Set connection var ansible_connection to ssh 15621 1726882623.69609: Set connection var ansible_shell_executable to /bin/sh 15621 1726882623.69630: Set connection var ansible_timeout to 10 15621 1726882623.69639: Set connection var ansible_shell_type to sh 15621 1726882623.69657: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882623.69724: Set connection var ansible_pipelining to False 15621 1726882623.69727: variable 'ansible_shell_executable' from source: unknown 15621 1726882623.69730: variable 'ansible_connection' from source: unknown 15621 1726882623.69735: variable 'ansible_module_compression' from source: unknown 15621 1726882623.69737: variable 'ansible_shell_type' from source: unknown 15621 1726882623.69740: variable 'ansible_shell_executable' from source: unknown 15621 1726882623.69742: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882623.69745: variable 'ansible_pipelining' from source: unknown 15621 1726882623.69750: variable 'ansible_timeout' from source: unknown 15621 1726882623.69892: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882623.69962: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15621 1726882623.69992: variable 'omit' from source: magic vars 15621 1726882623.70005: starting attempt loop 15621 1726882623.70013: running the handler 15621 1726882623.70102: variable '__network_connections_result' from source: set_fact 15621 1726882623.70213: variable '__network_connections_result' from source: set_fact 15621 1726882623.70361: handler run complete 15621 1726882623.70398: attempt loop complete, returning result 15621 1726882623.70465: _execute() done 15621 1726882623.70469: dumping result to json 15621 1726882623.70472: done dumping result, returning 15621 1726882623.70478: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affc7ec-ae25-af1a-5b92-00000000006f] 15621 1726882623.70481: sending task result for task 0affc7ec-ae25-af1a-5b92-00000000006f 15621 1726882623.70711: done sending task result for task 0affc7ec-ae25-af1a-5b92-00000000006f 15621 1726882623.70716: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "lsr27", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 15621 1726882623.70804: no more pending results, returning what we have 15621 1726882623.70807: results queue empty 15621 1726882623.70809: checking for any_errors_fatal 15621 1726882623.70813: done checking for any_errors_fatal 15621 1726882623.70814: checking for max_fail_percentage 15621 1726882623.70815: done checking for max_fail_percentage 15621 1726882623.70816: checking to see if 
all hosts have failed and the running result is not ok 15621 1726882623.70817: done checking to see if all hosts have failed 15621 1726882623.70818: getting the remaining hosts for this loop 15621 1726882623.70819: done getting the remaining hosts for this loop 15621 1726882623.70825: getting the next task for host managed_node3 15621 1726882623.70832: done getting next task for host managed_node3 15621 1726882623.70836: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 15621 1726882623.70838: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15621 1726882623.70848: getting variables 15621 1726882623.70850: in VariableManager get_vars() 15621 1726882623.70886: Calling all_inventory to load vars for managed_node3 15621 1726882623.70889: Calling groups_inventory to load vars for managed_node3 15621 1726882623.70891: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882623.70901: Calling all_plugins_play to load vars for managed_node3 15621 1726882623.70904: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882623.70906: Calling groups_plugins_play to load vars for managed_node3 15621 1726882623.85336: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882623.88596: done with get_vars() 15621 1726882623.88650: done getting variables 15621 1726882623.88732: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:37:03 -0400 (0:00:00.226) 0:00:55.966 ****** 15621 1726882623.88769: entering _queue_task() for managed_node3/debug 15621 1726882623.89240: worker is 1 (out of 1 available) 15621 1726882623.89256: exiting _queue_task() for managed_node3/debug 15621 1726882623.89269: done queuing things up, now waiting for results queue to drain 15621 1726882623.89271: waiting for pending results... 
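[editor's note] The debug tasks in this stretch ("Show stderr messages..." and "Show debug messages for the network_connections") only surface fields of the __network_connections_result fact recorded earlier. As a quick illustration of that structure, the snippet below parses a trimmed copy of the result dict as it appears in the log; the parsing code is illustrative only and is not something the role itself runs.

    # Illustrative only: inspect a trimmed copy of __network_connections_result
    # as printed by the debug tasks above.
    import json

    result_json = '''{"changed": true, "failed": false, "stderr": "\\n", "stderr_lines": [""],
      "_invocation": {"module_args": {"provider": "nm",
        "connections": [{"name": "lsr27", "persistent_state": "absent"}]}}}'''

    result = json.loads(result_json)
    assert result["changed"] and not result["failed"]
    # stderr was a single newline, which is why stderr_lines comes out as [""].
    assert result["stderr"].splitlines() == result["stderr_lines"]
    print(result["_invocation"]["module_args"]["connections"][0]["name"])  # lsr27
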
15621 1726882623.89533: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 15621 1726882623.89636: in run() - task 0affc7ec-ae25-af1a-5b92-000000000070 15621 1726882623.89647: variable 'ansible_search_path' from source: unknown 15621 1726882623.89650: variable 'ansible_search_path' from source: unknown 15621 1726882623.89693: calling self._execute() 15621 1726882623.89907: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882623.89912: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882623.89915: variable 'omit' from source: magic vars 15621 1726882623.90343: variable 'ansible_distribution_major_version' from source: facts 15621 1726882623.90347: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882623.90350: variable 'network_state' from source: role '' defaults 15621 1726882623.90366: Evaluated conditional (network_state != {}): False 15621 1726882623.90371: when evaluation is False, skipping this task 15621 1726882623.90376: _execute() done 15621 1726882623.90379: dumping result to json 15621 1726882623.90382: done dumping result, returning 15621 1726882623.90385: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affc7ec-ae25-af1a-5b92-000000000070] 15621 1726882623.90391: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000070 skipping: [managed_node3] => { "false_condition": "network_state != {}" } 15621 1726882623.90898: no more pending results, returning what we have 15621 1726882623.90901: results queue empty 15621 1726882623.90902: checking for any_errors_fatal 15621 1726882623.90910: done checking for any_errors_fatal 15621 1726882623.90911: checking for max_fail_percentage 15621 1726882623.90912: done checking for max_fail_percentage 15621 1726882623.90913: checking to see if all hosts have failed and the running result is not ok 15621 1726882623.90914: done checking to see if all hosts have failed 15621 1726882623.90915: getting the remaining hosts for this loop 15621 1726882623.90916: done getting the remaining hosts for this loop 15621 1726882623.90919: getting the next task for host managed_node3 15621 1726882623.90929: done getting next task for host managed_node3 15621 1726882623.90932: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 15621 1726882623.90934: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882623.90947: getting variables 15621 1726882623.90948: in VariableManager get_vars() 15621 1726882623.90987: Calling all_inventory to load vars for managed_node3 15621 1726882623.90989: Calling groups_inventory to load vars for managed_node3 15621 1726882623.90991: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882623.91001: Calling all_plugins_play to load vars for managed_node3 15621 1726882623.91003: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882623.91005: Calling groups_plugins_play to load vars for managed_node3 15621 1726882623.91539: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000070 15621 1726882623.91544: WORKER PROCESS EXITING 15621 1726882623.93001: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882623.95299: done with get_vars() 15621 1726882623.95329: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:37:03 -0400 (0:00:00.066) 0:00:56.033 ****** 15621 1726882623.95433: entering _queue_task() for managed_node3/ping 15621 1726882623.95798: worker is 1 (out of 1 available) 15621 1726882623.95812: exiting _queue_task() for managed_node3/ping 15621 1726882623.95830: done queuing things up, now waiting for results queue to drain 15621 1726882623.95832: waiting for pending results... 15621 1726882623.96140: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 15621 1726882623.96275: in run() - task 0affc7ec-ae25-af1a-5b92-000000000071 15621 1726882623.96297: variable 'ansible_search_path' from source: unknown 15621 1726882623.96304: variable 'ansible_search_path' from source: unknown 15621 1726882623.96367: calling self._execute() 15621 1726882623.96467: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882623.96488: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882623.96504: variable 'omit' from source: magic vars 15621 1726882623.96916: variable 'ansible_distribution_major_version' from source: facts 15621 1726882623.97021: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882623.97027: variable 'omit' from source: magic vars 15621 1726882623.97030: variable 'omit' from source: magic vars 15621 1726882623.97032: variable 'omit' from source: magic vars 15621 1726882623.97070: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882623.97119: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882623.97150: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882623.97175: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882623.97193: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882623.97237: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882623.97353: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882623.97358: variable 'ansible_ssh_extra_args' from source: host vars 
for 'managed_node3' 15621 1726882623.97381: Set connection var ansible_connection to ssh 15621 1726882623.97397: Set connection var ansible_shell_executable to /bin/sh 15621 1726882623.97408: Set connection var ansible_timeout to 10 15621 1726882623.97415: Set connection var ansible_shell_type to sh 15621 1726882623.97428: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882623.97439: Set connection var ansible_pipelining to False 15621 1726882623.97475: variable 'ansible_shell_executable' from source: unknown 15621 1726882623.97484: variable 'ansible_connection' from source: unknown 15621 1726882623.97492: variable 'ansible_module_compression' from source: unknown 15621 1726882623.97500: variable 'ansible_shell_type' from source: unknown 15621 1726882623.97507: variable 'ansible_shell_executable' from source: unknown 15621 1726882623.97515: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882623.97524: variable 'ansible_pipelining' from source: unknown 15621 1726882623.97533: variable 'ansible_timeout' from source: unknown 15621 1726882623.97542: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882623.97810: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15621 1726882623.97832: variable 'omit' from source: magic vars 15621 1726882623.97841: starting attempt loop 15621 1726882623.97848: running the handler 15621 1726882623.97868: _low_level_execute_command(): starting 15621 1726882623.97897: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15621 1726882623.98669: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882623.98744: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882623.98808: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882623.98830: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882623.98846: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882623.98970: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882624.00780: stdout chunk (state=3): >>>/root <<< 15621 1726882624.00971: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882624.00978: stdout chunk (state=3): >>><<< 15621 1726882624.00981: stderr chunk (state=3): >>><<< 15621 1726882624.01108: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882624.01111: _low_level_execute_command(): starting 15621 1726882624.01114: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882624.0100615-17675-33288029851353 `" && echo ansible-tmp-1726882624.0100615-17675-33288029851353="` echo /root/.ansible/tmp/ansible-tmp-1726882624.0100615-17675-33288029851353 `" ) && sleep 0' 15621 1726882624.01629: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882624.01645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882624.01689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882624.01716: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882624.01719: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882624.01820: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882624.03884: stdout chunk (state=3): >>>ansible-tmp-1726882624.0100615-17675-33288029851353=/root/.ansible/tmp/ansible-tmp-1726882624.0100615-17675-33288029851353 <<< 15621 1726882624.04085: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882624.04089: stdout chunk (state=3): >>><<< 15621 1726882624.04091: stderr chunk (state=3): >>><<< 15621 1726882624.04228: 
_low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882624.0100615-17675-33288029851353=/root/.ansible/tmp/ansible-tmp-1726882624.0100615-17675-33288029851353 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882624.04232: variable 'ansible_module_compression' from source: unknown 15621 1726882624.04235: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15621x2kdpbzd/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 15621 1726882624.04268: variable 'ansible_facts' from source: unknown 15621 1726882624.04480: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882624.0100615-17675-33288029851353/AnsiballZ_ping.py 15621 1726882624.04693: Sending initial data 15621 1726882624.04697: Sent initial data (152 bytes) 15621 1726882624.05403: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882624.05481: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15621 1726882624.05485: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882624.05554: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882624.05578: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882624.05706: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882624.07374: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension 
"statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15621 1726882624.07469: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15621 1726882624.07602: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmpbz9mcwgf /root/.ansible/tmp/ansible-tmp-1726882624.0100615-17675-33288029851353/AnsiballZ_ping.py <<< 15621 1726882624.07605: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882624.0100615-17675-33288029851353/AnsiballZ_ping.py" <<< 15621 1726882624.07690: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmpbz9mcwgf" to remote "/root/.ansible/tmp/ansible-tmp-1726882624.0100615-17675-33288029851353/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882624.0100615-17675-33288029851353/AnsiballZ_ping.py" <<< 15621 1726882624.09034: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882624.09039: stdout chunk (state=3): >>><<< 15621 1726882624.09041: stderr chunk (state=3): >>><<< 15621 1726882624.09043: done transferring module to remote 15621 1726882624.09046: _low_level_execute_command(): starting 15621 1726882624.09048: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882624.0100615-17675-33288029851353/ /root/.ansible/tmp/ansible-tmp-1726882624.0100615-17675-33288029851353/AnsiballZ_ping.py && sleep 0' 15621 1726882624.09685: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882624.09705: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882624.09723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882624.09794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882624.09855: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882624.09893: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882624.09962: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 15621 1726882624.10064: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882624.12096: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882624.12100: stdout chunk (state=3): >>><<< 15621 1726882624.12103: stderr chunk (state=3): >>><<< 15621 1726882624.12244: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882624.12250: _low_level_execute_command(): starting 15621 1726882624.12253: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882624.0100615-17675-33288029851353/AnsiballZ_ping.py && sleep 0' 15621 1726882624.13825: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882624.14002: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882624.14044: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882624.14236: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882624.30342: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 15621 1726882624.31778: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 
<<< 15621 1726882624.31837: stderr chunk (state=3): >>><<< 15621 1726882624.31842: stdout chunk (state=3): >>><<< 15621 1726882624.31858: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 15621 1726882624.31881: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882624.0100615-17675-33288029851353/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15621 1726882624.31892: _low_level_execute_command(): starting 15621 1726882624.31897: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882624.0100615-17675-33288029851353/ > /dev/null 2>&1 && sleep 0' 15621 1726882624.32539: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882624.32543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882624.32547: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration <<< 15621 1726882624.32596: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found <<< 15621 1726882624.32602: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882624.32670: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882624.32673: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882624.32679: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882624.32774: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882624.34754: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882624.34802: stderr chunk (state=3): >>><<< 15621 1726882624.34805: stdout chunk (state=3): >>><<< 15621 1726882624.34821: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882624.34830: handler run complete 15621 1726882624.34852: attempt loop complete, returning result 15621 1726882624.34855: _execute() done 15621 1726882624.34858: dumping result to json 15621 1726882624.34860: done dumping result, returning 15621 1726882624.34879: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affc7ec-ae25-af1a-5b92-000000000071] 15621 1726882624.34882: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000071 15621 1726882624.35025: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000071 15621 1726882624.35028: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 15621 1726882624.35108: no more pending results, returning what we have 15621 1726882624.35111: results queue empty 15621 1726882624.35112: checking for any_errors_fatal 15621 1726882624.35118: done checking for any_errors_fatal 15621 1726882624.35119: checking for max_fail_percentage 15621 1726882624.35121: done checking for max_fail_percentage 15621 1726882624.35124: checking to see if all hosts have failed and the running result is not ok 15621 1726882624.35126: done checking to see if all hosts have failed 15621 1726882624.35126: getting the remaining hosts for this loop 15621 1726882624.35128: done getting the remaining hosts for this loop 15621 1726882624.35132: getting the next task for host managed_node3 15621 1726882624.35140: done getting next task for host managed_node3 15621 1726882624.35142: ^ task is: TASK: meta (role_complete) 
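
The "ok" ping result above is produced by the remote-execution cycle that the _low_level_execute_command() calls in this trace drive: create a temp directory under ~/.ansible/tmp, transfer the AnsiballZ payload with an sftp "put", chmod it, run it with the remote Python, then remove the directory. The following is only a minimal sketch of that cycle using plain ssh/scp subprocess calls instead of Ansible's connection plugin; the host and local payload path are placeholders, not values from this run.

    # Sketch of the transfer/execute/cleanup sequence seen in the log above.
    # NOT Ansible's implementation; it only mirrors the shell commands that
    # _low_level_execute_command() issues (mkdir / chmod / python / rm).
    import subprocess

    HOST = "root@192.0.2.10"                 # placeholder target, not the host in this log
    LOCAL_MODULE = "/tmp/AnsiballZ_ping.py"  # placeholder payload built elsewhere

    def ssh(cmd: str) -> str:
        # Run one shell command on the target, roughly one
        # _low_level_execute_command() round trip.
        return subprocess.run(["ssh", HOST, cmd], check=True,
                              capture_output=True, text=True).stdout

    # 1. create the remote temp directory (the log wraps this in "umask 77 && mkdir ...")
    remote_dir = ssh("umask 77 && mkdir -p ~/.ansible/tmp/demo && echo ~/.ansible/tmp/demo").strip()

    # 2. transfer the module payload (the log does this step with an sftp "put")
    subprocess.run(["scp", LOCAL_MODULE, f"{HOST}:{remote_dir}/AnsiballZ_ping.py"], check=True)

    # 3. mark it executable and run it with the remote interpreter
    ssh(f"chmod u+x {remote_dir} {remote_dir}/AnsiballZ_ping.py && sleep 0")
    print(ssh(f"/usr/bin/python3.12 {remote_dir}/AnsiballZ_ping.py && sleep 0"))  # {"ping": "pong", ...}

    # 4. clean up, mirroring the final "rm -f -r ... && sleep 0" in the log
    ssh(f"rm -f -r {remote_dir}/ > /dev/null 2>&1 && sleep 0")
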
15621 1726882624.35144: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15621 1726882624.35154: getting variables 15621 1726882624.35156: in VariableManager get_vars() 15621 1726882624.35195: Calling all_inventory to load vars for managed_node3 15621 1726882624.35198: Calling groups_inventory to load vars for managed_node3 15621 1726882624.35200: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882624.35210: Calling all_plugins_play to load vars for managed_node3 15621 1726882624.35212: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882624.35215: Calling groups_plugins_play to load vars for managed_node3 15621 1726882624.36518: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882624.38777: done with get_vars() 15621 1726882624.38813: done getting variables 15621 1726882624.38941: done queuing things up, now waiting for results queue to drain 15621 1726882624.38944: results queue empty 15621 1726882624.38945: checking for any_errors_fatal 15621 1726882624.38948: done checking for any_errors_fatal 15621 1726882624.38949: checking for max_fail_percentage 15621 1726882624.38950: done checking for max_fail_percentage 15621 1726882624.38951: checking to see if all hosts have failed and the running result is not ok 15621 1726882624.38951: done checking to see if all hosts have failed 15621 1726882624.38952: getting the remaining hosts for this loop 15621 1726882624.38953: done getting the remaining hosts for this loop 15621 1726882624.38956: getting the next task for host managed_node3 15621 1726882624.38960: done getting next task for host managed_node3 15621 1726882624.38962: ^ task is: TASK: meta (flush_handlers) 15621 1726882624.38964: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882624.38967: getting variables 15621 1726882624.38968: in VariableManager get_vars() 15621 1726882624.38980: Calling all_inventory to load vars for managed_node3 15621 1726882624.38981: Calling groups_inventory to load vars for managed_node3 15621 1726882624.38983: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882624.38987: Calling all_plugins_play to load vars for managed_node3 15621 1726882624.38988: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882624.38990: Calling groups_plugins_play to load vars for managed_node3 15621 1726882624.40430: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882624.42104: done with get_vars() 15621 1726882624.42121: done getting variables 15621 1726882624.42157: in VariableManager get_vars() 15621 1726882624.42167: Calling all_inventory to load vars for managed_node3 15621 1726882624.42168: Calling groups_inventory to load vars for managed_node3 15621 1726882624.42170: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882624.42175: Calling all_plugins_play to load vars for managed_node3 15621 1726882624.42177: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882624.42179: Calling groups_plugins_play to load vars for managed_node3 15621 1726882624.43255: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882624.44787: done with get_vars() 15621 1726882624.44806: done queuing things up, now waiting for results queue to drain 15621 1726882624.44808: results queue empty 15621 1726882624.44809: checking for any_errors_fatal 15621 1726882624.44810: done checking for any_errors_fatal 15621 1726882624.44810: checking for max_fail_percentage 15621 1726882624.44811: done checking for max_fail_percentage 15621 1726882624.44811: checking to see if all hosts have failed and the running result is not ok 15621 1726882624.44812: done checking to see if all hosts have failed 15621 1726882624.44813: getting the remaining hosts for this loop 15621 1726882624.44813: done getting the remaining hosts for this loop 15621 1726882624.44815: getting the next task for host managed_node3 15621 1726882624.44818: done getting next task for host managed_node3 15621 1726882624.44819: ^ task is: TASK: meta (flush_handlers) 15621 1726882624.44820: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882624.44824: getting variables 15621 1726882624.44825: in VariableManager get_vars() 15621 1726882624.44834: Calling all_inventory to load vars for managed_node3 15621 1726882624.44835: Calling groups_inventory to load vars for managed_node3 15621 1726882624.44837: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882624.44841: Calling all_plugins_play to load vars for managed_node3 15621 1726882624.44842: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882624.44844: Calling groups_plugins_play to load vars for managed_node3 15621 1726882624.45659: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882624.47578: done with get_vars() 15621 1726882624.47610: done getting variables 15621 1726882624.47671: in VariableManager get_vars() 15621 1726882624.47685: Calling all_inventory to load vars for managed_node3 15621 1726882624.47687: Calling groups_inventory to load vars for managed_node3 15621 1726882624.47689: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882624.47694: Calling all_plugins_play to load vars for managed_node3 15621 1726882624.47697: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882624.47700: Calling groups_plugins_play to load vars for managed_node3 15621 1726882624.49231: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882624.51374: done with get_vars() 15621 1726882624.51412: done queuing things up, now waiting for results queue to drain 15621 1726882624.51414: results queue empty 15621 1726882624.51415: checking for any_errors_fatal 15621 1726882624.51417: done checking for any_errors_fatal 15621 1726882624.51417: checking for max_fail_percentage 15621 1726882624.51419: done checking for max_fail_percentage 15621 1726882624.51420: checking to see if all hosts have failed and the running result is not ok 15621 1726882624.51420: done checking to see if all hosts have failed 15621 1726882624.51423: getting the remaining hosts for this loop 15621 1726882624.51424: done getting the remaining hosts for this loop 15621 1726882624.51427: getting the next task for host managed_node3 15621 1726882624.51430: done getting next task for host managed_node3 15621 1726882624.51431: ^ task is: None 15621 1726882624.51433: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882624.51434: done queuing things up, now waiting for results queue to drain 15621 1726882624.51435: results queue empty 15621 1726882624.51436: checking for any_errors_fatal 15621 1726882624.51436: done checking for any_errors_fatal 15621 1726882624.51437: checking for max_fail_percentage 15621 1726882624.51438: done checking for max_fail_percentage 15621 1726882624.51439: checking to see if all hosts have failed and the running result is not ok 15621 1726882624.51439: done checking to see if all hosts have failed 15621 1726882624.51441: getting the next task for host managed_node3 15621 1726882624.51443: done getting next task for host managed_node3 15621 1726882624.51444: ^ task is: None 15621 1726882624.51445: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15621 1726882624.51491: in VariableManager get_vars() 15621 1726882624.51509: done with get_vars() 15621 1726882624.51514: in VariableManager get_vars() 15621 1726882624.51526: done with get_vars() 15621 1726882624.51531: variable 'omit' from source: magic vars 15621 1726882624.51562: in VariableManager get_vars() 15621 1726882624.51574: done with get_vars() 15621 1726882624.51599: variable 'omit' from source: magic vars PLAY [Assert device and profile are absent] ************************************ 15621 1726882624.51850: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15621 1726882624.51875: getting the remaining hosts for this loop 15621 1726882624.51876: done getting the remaining hosts for this loop 15621 1726882624.51879: getting the next task for host managed_node3 15621 1726882624.51881: done getting next task for host managed_node3 15621 1726882624.51884: ^ task is: TASK: Gathering Facts 15621 1726882624.51885: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882624.51887: getting variables 15621 1726882624.51888: in VariableManager get_vars() 15621 1726882624.51896: Calling all_inventory to load vars for managed_node3 15621 1726882624.51899: Calling groups_inventory to load vars for managed_node3 15621 1726882624.51901: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882624.51907: Calling all_plugins_play to load vars for managed_node3 15621 1726882624.51909: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882624.51912: Calling groups_plugins_play to load vars for managed_node3 15621 1726882624.53406: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882624.55514: done with get_vars() 15621 1726882624.55540: done getting variables 15621 1726882624.55591: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:68 Friday 20 September 2024 21:37:04 -0400 (0:00:00.601) 0:00:56.635 ****** 15621 1726882624.55618: entering _queue_task() for managed_node3/gather_facts 15621 1726882624.55994: worker is 1 (out of 1 available) 15621 1726882624.56007: exiting _queue_task() for managed_node3/gather_facts 15621 1726882624.56020: done queuing things up, now waiting for results queue to drain 15621 1726882624.56224: waiting for pending results... 
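
The trace that follows repeats the same transfer/execute/cleanup cycle for the ansible.legacy.setup module; its stdout is one JSON document whose ansible_facts dictionary feeds later conditionals such as ansible_distribution_major_version != '6'. If that JSON object is saved out of a dump like this one, a few fields can be read back with a short sketch; the filename here is a placeholder, not something written by this run.

    # Sketch: read selected facts back out of a captured setup-module result.
    # "setup_stdout.json" stands in for the {"ansible_facts": ..., "invocation": ...}
    # object that AnsiballZ_setup.py prints on stdout in the trace below.
    import json

    with open("setup_stdout.json") as fh:
        result = json.load(fh)

    facts = result["ansible_facts"]
    print(facts["ansible_distribution"], facts["ansible_distribution_major_version"])
    print(facts["ansible_default_ipv4"]["address"])  # address reported for eth0
    print(facts["ansible_interfaces"])               # ["eth0", "lo"] in this run
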
15621 1726882624.56354: running TaskExecutor() for managed_node3/TASK: Gathering Facts 15621 1726882624.56453: in run() - task 0affc7ec-ae25-af1a-5b92-0000000004e4 15621 1726882624.56560: variable 'ansible_search_path' from source: unknown 15621 1726882624.56565: calling self._execute() 15621 1726882624.56625: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882624.56638: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882624.56652: variable 'omit' from source: magic vars 15621 1726882624.57086: variable 'ansible_distribution_major_version' from source: facts 15621 1726882624.57109: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882624.57123: variable 'omit' from source: magic vars 15621 1726882624.57156: variable 'omit' from source: magic vars 15621 1726882624.57197: variable 'omit' from source: magic vars 15621 1726882624.57251: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882624.57296: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882624.57328: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882624.57425: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882624.57430: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882624.57433: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882624.57436: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882624.57439: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882624.57536: Set connection var ansible_connection to ssh 15621 1726882624.57553: Set connection var ansible_shell_executable to /bin/sh 15621 1726882624.57564: Set connection var ansible_timeout to 10 15621 1726882624.57572: Set connection var ansible_shell_type to sh 15621 1726882624.57583: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882624.57593: Set connection var ansible_pipelining to False 15621 1726882624.57625: variable 'ansible_shell_executable' from source: unknown 15621 1726882624.57640: variable 'ansible_connection' from source: unknown 15621 1726882624.57649: variable 'ansible_module_compression' from source: unknown 15621 1726882624.57658: variable 'ansible_shell_type' from source: unknown 15621 1726882624.57666: variable 'ansible_shell_executable' from source: unknown 15621 1726882624.57729: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882624.57733: variable 'ansible_pipelining' from source: unknown 15621 1726882624.57737: variable 'ansible_timeout' from source: unknown 15621 1726882624.57740: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882624.57909: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15621 1726882624.57931: variable 'omit' from source: magic vars 15621 1726882624.57942: starting attempt loop 15621 1726882624.57949: running the 
handler 15621 1726882624.57976: variable 'ansible_facts' from source: unknown 15621 1726882624.58004: _low_level_execute_command(): starting 15621 1726882624.58077: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15621 1726882624.58788: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882624.58812: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882624.58837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882624.58925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882624.58966: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882624.58988: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882624.59025: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882624.59148: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882624.60949: stdout chunk (state=3): >>>/root <<< 15621 1726882624.61092: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882624.61116: stderr chunk (state=3): >>><<< 15621 1726882624.61120: stdout chunk (state=3): >>><<< 15621 1726882624.61145: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882624.61158: _low_level_execute_command(): starting 15621 1726882624.61164: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726882624.6114485-17694-38402412771480 `" && echo ansible-tmp-1726882624.6114485-17694-38402412771480="` echo /root/.ansible/tmp/ansible-tmp-1726882624.6114485-17694-38402412771480 `" ) && sleep 0' 15621 1726882624.61964: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882624.61991: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882624.62000: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882624.62003: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882624.62113: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882624.64177: stdout chunk (state=3): >>>ansible-tmp-1726882624.6114485-17694-38402412771480=/root/.ansible/tmp/ansible-tmp-1726882624.6114485-17694-38402412771480 <<< 15621 1726882624.64291: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882624.64391: stderr chunk (state=3): >>><<< 15621 1726882624.64395: stdout chunk (state=3): >>><<< 15621 1726882624.64420: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882624.6114485-17694-38402412771480=/root/.ansible/tmp/ansible-tmp-1726882624.6114485-17694-38402412771480 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882624.64651: variable 'ansible_module_compression' from source: unknown 15621 1726882624.64654: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-15621x2kdpbzd/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15621 1726882624.64656: variable 'ansible_facts' from source: unknown 15621 1726882624.64810: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882624.6114485-17694-38402412771480/AnsiballZ_setup.py 15621 1726882624.64994: Sending initial data 15621 1726882624.65004: Sent initial data (153 bytes) 15621 1726882624.65609: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882624.65648: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882624.65673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882624.65691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882624.65709: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 15621 1726882624.65725: stderr chunk (state=3): >>>debug2: match not found <<< 15621 1726882624.65742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882624.65777: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15621 1726882624.65812: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882624.65886: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882624.65900: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882624.65999: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882624.67745: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 15621 1726882624.67775: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15621 1726882624.67854: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15621 1726882624.67959: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmpuf4v0aji /root/.ansible/tmp/ansible-tmp-1726882624.6114485-17694-38402412771480/AnsiballZ_setup.py <<< 15621 1726882624.67962: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882624.6114485-17694-38402412771480/AnsiballZ_setup.py" <<< 15621 1726882624.68043: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmpuf4v0aji" to remote "/root/.ansible/tmp/ansible-tmp-1726882624.6114485-17694-38402412771480/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882624.6114485-17694-38402412771480/AnsiballZ_setup.py" <<< 15621 1726882624.69729: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882624.69798: stderr chunk (state=3): >>><<< 15621 1726882624.69802: stdout chunk (state=3): >>><<< 15621 1726882624.69824: done transferring module to remote 15621 1726882624.69836: _low_level_execute_command(): starting 15621 1726882624.69841: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882624.6114485-17694-38402412771480/ /root/.ansible/tmp/ansible-tmp-1726882624.6114485-17694-38402412771480/AnsiballZ_setup.py && sleep 0' 15621 1726882624.70314: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882624.70318: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found <<< 15621 1726882624.70321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882624.70326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882624.70373: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882624.70382: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882624.70467: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882624.72395: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882624.72439: stderr chunk (state=3): >>><<< 15621 1726882624.72442: stdout chunk (state=3): >>><<< 15621 1726882624.72455: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882624.72458: _low_level_execute_command(): starting 15621 1726882624.72464: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882624.6114485-17694-38402412771480/AnsiballZ_setup.py && sleep 0' 15621 1726882624.72900: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882624.72903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882624.72906: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882624.72908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882624.72962: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882624.72969: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882624.73057: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882626.76174: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCqxUjCEqNblrTyW6Uf6mIYxca8N+p8oJNuOoXU65bGRNg3CMa5WjaOdqcLJoa5cqHU94Eb2GKTTyez0hcVUk7tsi3NxQudVrBDJQwbGPLKwHTfAOeffQrSKU6cQIc1wl+jLeNyQet7t+mRPHDLLjdsLuWud7KDSFY7tB05hqCIT7br7Ql/dFhcnCdWQFQMOHFOz3ScJe9gey/LD3ji7GRONjSr/t5cpKmB6mxzEmsb1n6YZdbP8HCphGcvKR4W+uaX3gVfQE0qvrqlobTyex8yIrkML2bRGO0cQ0YQWRYUwl+2NZufO8pixR1WlzvjooEQLCa77cJ6SZ8LyFkDOI+mMyuj8kcM9hS4AD91rPxl8C0d6Jg8RKqnImxC3X/NNaRYHqlewUGo6VKkcO4+lxgJGqYFcmkGEHzq4fuf6gtrr3rJkcIFcrluI0mSyZ2wXzI9K1OLHK0fnDvDUdV21RdTxfpz2ZFqykIWxdtugE4qaNMgbtV0VnufdkfZoCt9ayU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCVkskQ7Wf194qJaR5aLJzIbxDeKLsVL0wQFKV8r0F7GGZAGvI7/LHajoQ1NvRR35h4P+UpQQWPriVBtLfXYfXQ=", "ansible_ssh_host_key_ecdsa_public_keytype": 
"ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHICQbhAvKstSrwCX3R+nlPjOjLF0EHt/gL32n1ZS9Xl", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "ip-10-31-45-226.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-226", "ansible_nodename": "ip-10-31-45-226.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ea14692d25e88f0b7167787b368d", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_iscsi_iqn": "", "ansible_fips": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_is_chroot": false, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.180 36814 10.31.45.226 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 36814 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_apparmor": {"status": "disabled"}, "ansible_loadavg": {"1m": 0.51025390625, "5m": 0.60302734375, "15m": 0.322265625}, "ansible_fibre_channel_wwn": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "37", "second": "06", "epo<<< 15621 1726882626.76181: stdout chunk (state=3): >>>ch": "1726882626", "epoch_int": "1726882626", "date": "2024-09-20", "time": "21:37:06", "iso8601_micro": "2024-09-21T01:37:06.410124Z", "iso8601": "2024-09-21T01:37:06Z", "iso8601_basic": "20240920T213706410124", "iso8601_basic_short": "20240920T213706", "tz": "EDT", "tz_dst": "EDT", 
"tz_offset": "-0400"}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:19:da:ea:a3:f3", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.226", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::19:daff:feea:a3f3", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.226", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:19:da:ea:a3:f3", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.226"], "ansible_all_ipv6_addresses": ["fe80::19:daff:feea:a3f3"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.226", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::19:daff:feea:a3f3"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3118, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 598, "free": 3118}, "nocache": {"free": 3501, "used": 215}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ea14-692d-25e8-8f0b-7167787b368d", "ansible_product_uuid": "ec22ea14-692d-25e8-8f0b-7167787b368d", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, 
"removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 770, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251384147968, "block_size": 4096, "block_total": 64483404, "block_available": 61373083, "block_used": 3110321, "inode_total": 16384000, "inode_available": 16303144, "inode_used": 80856, "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"}], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:b5954bb9-e972-4b2a-94f1-a82c77e96f77", "ansible_lsb": {}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15621 1726882626.78314: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 
<<< 15621 1726882626.78318: stdout chunk (state=3): >>><<< 15621 1726882626.78321: stderr chunk (state=3): >>><<< 15621 1726882626.78327: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCqxUjCEqNblrTyW6Uf6mIYxca8N+p8oJNuOoXU65bGRNg3CMa5WjaOdqcLJoa5cqHU94Eb2GKTTyez0hcVUk7tsi3NxQudVrBDJQwbGPLKwHTfAOeffQrSKU6cQIc1wl+jLeNyQet7t+mRPHDLLjdsLuWud7KDSFY7tB05hqCIT7br7Ql/dFhcnCdWQFQMOHFOz3ScJe9gey/LD3ji7GRONjSr/t5cpKmB6mxzEmsb1n6YZdbP8HCphGcvKR4W+uaX3gVfQE0qvrqlobTyex8yIrkML2bRGO0cQ0YQWRYUwl+2NZufO8pixR1WlzvjooEQLCa77cJ6SZ8LyFkDOI+mMyuj8kcM9hS4AD91rPxl8C0d6Jg8RKqnImxC3X/NNaRYHqlewUGo6VKkcO4+lxgJGqYFcmkGEHzq4fuf6gtrr3rJkcIFcrluI0mSyZ2wXzI9K1OLHK0fnDvDUdV21RdTxfpz2ZFqykIWxdtugE4qaNMgbtV0VnufdkfZoCt9ayU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCVkskQ7Wf194qJaR5aLJzIbxDeKLsVL0wQFKV8r0F7GGZAGvI7/LHajoQ1NvRR35h4P+UpQQWPriVBtLfXYfXQ=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHICQbhAvKstSrwCX3R+nlPjOjLF0EHt/gL32n1ZS9Xl", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "ip-10-31-45-226.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-226", "ansible_nodename": "ip-10-31-45-226.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ea14692d25e88f0b7167787b368d", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_iscsi_iqn": "", "ansible_fips": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_is_chroot": false, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.180 36814 10.31.45.226 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 36814 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_local": {}, 
"ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_apparmor": {"status": "disabled"}, "ansible_loadavg": {"1m": 0.51025390625, "5m": 0.60302734375, "15m": 0.322265625}, "ansible_fibre_channel_wwn": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "37", "second": "06", "epoch": "1726882626", "epoch_int": "1726882626", "date": "2024-09-20", "time": "21:37:06", "iso8601_micro": "2024-09-21T01:37:06.410124Z", "iso8601": "2024-09-21T01:37:06Z", "iso8601_basic": "20240920T213706410124", "iso8601_basic_short": "20240920T213706", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:19:da:ea:a3:f3", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.226", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::19:daff:feea:a3f3", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.226", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:19:da:ea:a3:f3", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.226"], "ansible_all_ipv6_addresses": ["fe80::19:daff:feea:a3f3"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.226", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::19:daff:feea:a3f3"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3118, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 598, "free": 3118}, "nocache": {"free": 3501, "used": 215}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ea14-692d-25e8-8f0b-7167787b368d", "ansible_product_uuid": "ec22ea14-692d-25e8-8f0b-7167787b368d", "ansible_product_version": 
"4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 770, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251384147968, "block_size": 4096, "block_total": 64483404, "block_available": 61373083, "block_used": 3110321, "inode_total": 16384000, "inode_available": 16303144, "inode_used": 80856, "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"}], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:b5954bb9-e972-4b2a-94f1-a82c77e96f77", "ansible_lsb": {}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 15621 1726882626.78649: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882624.6114485-17694-38402412771480/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15621 1726882626.78687: _low_level_execute_command(): starting 15621 1726882626.78697: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882624.6114485-17694-38402412771480/ > /dev/null 2>&1 && sleep 0' 15621 1726882626.79369: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882626.79383: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882626.79398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882626.79414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882626.79534: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 15621 1726882626.79538: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882626.79564: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882626.79680: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882626.81695: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882626.81698: stdout chunk (state=3): >>><<< 15621 1726882626.81701: stderr chunk (state=3): >>><<< 15621 1726882626.81727: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882626.81828: handler run complete 15621 1726882626.81887: variable 'ansible_facts' from source: unknown 15621 1726882626.82003: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882626.82371: variable 'ansible_facts' from source: unknown 15621 1726882626.82470: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882626.82708: attempt loop complete, returning result 15621 1726882626.82711: _execute() done 15621 1726882626.82714: dumping result to json 15621 1726882626.82716: done dumping result, returning 15621 1726882626.82719: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [0affc7ec-ae25-af1a-5b92-0000000004e4] 15621 1726882626.82721: sending task result for task 0affc7ec-ae25-af1a-5b92-0000000004e4 15621 1726882626.83233: done sending task result for task 0affc7ec-ae25-af1a-5b92-0000000004e4 15621 1726882626.83237: WORKER PROCESS EXITING ok: [managed_node3] 15621 1726882626.83584: no more pending results, returning what we have 15621 1726882626.83588: results queue empty 15621 1726882626.83589: checking for any_errors_fatal 15621 1726882626.83590: done checking for any_errors_fatal 15621 1726882626.83591: checking for max_fail_percentage 15621 1726882626.83593: done checking for max_fail_percentage 15621 1726882626.83594: checking to see if all hosts have failed and the running result is not ok 15621 1726882626.83595: done checking to see if all hosts have failed 15621 1726882626.83596: getting the remaining hosts for this loop 15621 1726882626.83599: done getting the remaining hosts for this loop 15621 1726882626.83602: getting the next task for host managed_node3 15621 1726882626.83608: done getting next task for host managed_node3 15621 1726882626.83609: ^ task is: TASK: meta (flush_handlers) 15621 1726882626.83611: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882626.83616: getting variables 15621 1726882626.83617: in VariableManager get_vars() 15621 1726882626.83639: Calling all_inventory to load vars for managed_node3 15621 1726882626.83641: Calling groups_inventory to load vars for managed_node3 15621 1726882626.83643: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882626.83652: Calling all_plugins_play to load vars for managed_node3 15621 1726882626.83654: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882626.83657: Calling groups_plugins_play to load vars for managed_node3 15621 1726882626.84594: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882626.86054: done with get_vars() 15621 1726882626.86083: done getting variables 15621 1726882626.86157: in VariableManager get_vars() 15621 1726882626.86168: Calling all_inventory to load vars for managed_node3 15621 1726882626.86170: Calling groups_inventory to load vars for managed_node3 15621 1726882626.86176: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882626.86181: Calling all_plugins_play to load vars for managed_node3 15621 1726882626.86184: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882626.86187: Calling groups_plugins_play to load vars for managed_node3 15621 1726882626.87255: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882626.88400: done with get_vars() 15621 1726882626.88421: done queuing things up, now waiting for results queue to drain 15621 1726882626.88424: results queue empty 15621 1726882626.88425: checking for any_errors_fatal 15621 1726882626.88429: done checking for any_errors_fatal 15621 1726882626.88429: checking for max_fail_percentage 15621 1726882626.88434: done checking for max_fail_percentage 15621 1726882626.88434: checking to see if all hosts have failed and the running result is not ok 15621 1726882626.88435: done checking to see if all hosts have failed 15621 1726882626.88436: getting the remaining hosts for this loop 15621 1726882626.88437: done getting the remaining hosts for this loop 15621 1726882626.88439: getting the next task for host managed_node3 15621 1726882626.88443: done getting next task for host managed_node3 15621 1726882626.88445: ^ task is: TASK: Include the task 'assert_profile_absent.yml' 15621 1726882626.88446: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882626.88447: getting variables 15621 1726882626.88448: in VariableManager get_vars() 15621 1726882626.88454: Calling all_inventory to load vars for managed_node3 15621 1726882626.88456: Calling groups_inventory to load vars for managed_node3 15621 1726882626.88457: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882626.88462: Calling all_plugins_play to load vars for managed_node3 15621 1726882626.88463: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882626.88465: Calling groups_plugins_play to load vars for managed_node3 15621 1726882626.89756: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882626.91466: done with get_vars() 15621 1726882626.91493: done getting variables TASK [Include the task 'assert_profile_absent.yml'] **************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:71 Friday 20 September 2024 21:37:06 -0400 (0:00:02.359) 0:00:58.995 ****** 15621 1726882626.91595: entering _queue_task() for managed_node3/include_tasks 15621 1726882626.92115: worker is 1 (out of 1 available) 15621 1726882626.92132: exiting _queue_task() for managed_node3/include_tasks 15621 1726882626.92147: done queuing things up, now waiting for results queue to drain 15621 1726882626.92149: waiting for pending results... 15621 1726882626.92860: running TaskExecutor() for managed_node3/TASK: Include the task 'assert_profile_absent.yml' 15621 1726882626.93083: in run() - task 0affc7ec-ae25-af1a-5b92-000000000074 15621 1726882626.93088: variable 'ansible_search_path' from source: unknown 15621 1726882626.93323: calling self._execute() 15621 1726882626.93437: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882626.93441: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882626.93445: variable 'omit' from source: magic vars 15621 1726882626.94083: variable 'ansible_distribution_major_version' from source: facts 15621 1726882626.94088: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882626.94091: _execute() done 15621 1726882626.94095: dumping result to json 15621 1726882626.94098: done dumping result, returning 15621 1726882626.94101: done running TaskExecutor() for managed_node3/TASK: Include the task 'assert_profile_absent.yml' [0affc7ec-ae25-af1a-5b92-000000000074] 15621 1726882626.94146: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000074 15621 1726882626.94486: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000074 15621 1726882626.94489: WORKER PROCESS EXITING 15621 1726882626.94524: no more pending results, returning what we have 15621 1726882626.94529: in VariableManager get_vars() 15621 1726882626.94563: Calling all_inventory to load vars for managed_node3 15621 1726882626.94566: Calling groups_inventory to load vars for managed_node3 15621 1726882626.94569: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882626.94581: Calling all_plugins_play to load vars for managed_node3 15621 1726882626.94584: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882626.94588: Calling groups_plugins_play to load vars for managed_node3 15621 1726882626.96926: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882626.98194: done with get_vars() 15621 
1726882626.98211: variable 'ansible_search_path' from source: unknown 15621 1726882626.98226: we have included files to process 15621 1726882626.98227: generating all_blocks data 15621 1726882626.98228: done generating all_blocks data 15621 1726882626.98228: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 15621 1726882626.98229: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 15621 1726882626.98231: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 15621 1726882626.98356: in VariableManager get_vars() 15621 1726882626.98368: done with get_vars() 15621 1726882626.98474: done processing included file 15621 1726882626.98477: iterating over new_blocks loaded from include file 15621 1726882626.98478: in VariableManager get_vars() 15621 1726882626.98500: done with get_vars() 15621 1726882626.98501: filtering new block on tags 15621 1726882626.98519: done filtering new block on tags 15621 1726882626.98523: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed_node3 15621 1726882626.98529: extending task lists for all hosts with included blocks 15621 1726882626.98586: done extending task lists 15621 1726882626.98588: done processing included files 15621 1726882626.98588: results queue empty 15621 1726882626.98589: checking for any_errors_fatal 15621 1726882626.98591: done checking for any_errors_fatal 15621 1726882626.98592: checking for max_fail_percentage 15621 1726882626.98593: done checking for max_fail_percentage 15621 1726882626.98593: checking to see if all hosts have failed and the running result is not ok 15621 1726882626.98594: done checking to see if all hosts have failed 15621 1726882626.98595: getting the remaining hosts for this loop 15621 1726882626.98596: done getting the remaining hosts for this loop 15621 1726882626.98599: getting the next task for host managed_node3 15621 1726882626.98602: done getting next task for host managed_node3 15621 1726882626.98605: ^ task is: TASK: Include the task 'get_profile_stat.yml' 15621 1726882626.98607: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882626.98610: getting variables 15621 1726882626.98611: in VariableManager get_vars() 15621 1726882626.98620: Calling all_inventory to load vars for managed_node3 15621 1726882626.98624: Calling groups_inventory to load vars for managed_node3 15621 1726882626.98626: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882626.98632: Calling all_plugins_play to load vars for managed_node3 15621 1726882626.98635: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882626.98638: Calling groups_plugins_play to load vars for managed_node3 15621 1726882627.00025: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882627.01182: done with get_vars() 15621 1726882627.01201: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Friday 20 September 2024 21:37:07 -0400 (0:00:00.096) 0:00:59.091 ****** 15621 1726882627.01267: entering _queue_task() for managed_node3/include_tasks 15621 1726882627.01553: worker is 1 (out of 1 available) 15621 1726882627.01567: exiting _queue_task() for managed_node3/include_tasks 15621 1726882627.01583: done queuing things up, now waiting for results queue to drain 15621 1726882627.01585: waiting for pending results... 15621 1726882627.01771: running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' 15621 1726882627.01851: in run() - task 0affc7ec-ae25-af1a-5b92-0000000004f5 15621 1726882627.01864: variable 'ansible_search_path' from source: unknown 15621 1726882627.01868: variable 'ansible_search_path' from source: unknown 15621 1726882627.01903: calling self._execute() 15621 1726882627.01980: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882627.01984: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882627.01994: variable 'omit' from source: magic vars 15621 1726882627.02293: variable 'ansible_distribution_major_version' from source: facts 15621 1726882627.02304: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882627.02310: _execute() done 15621 1726882627.02313: dumping result to json 15621 1726882627.02319: done dumping result, returning 15621 1726882627.02327: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' [0affc7ec-ae25-af1a-5b92-0000000004f5] 15621 1726882627.02333: sending task result for task 0affc7ec-ae25-af1a-5b92-0000000004f5 15621 1726882627.02424: done sending task result for task 0affc7ec-ae25-af1a-5b92-0000000004f5 15621 1726882627.02428: WORKER PROCESS EXITING 15621 1726882627.02459: no more pending results, returning what we have 15621 1726882627.02464: in VariableManager get_vars() 15621 1726882627.02504: Calling all_inventory to load vars for managed_node3 15621 1726882627.02508: Calling groups_inventory to load vars for managed_node3 15621 1726882627.02511: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882627.02529: Calling all_plugins_play to load vars for managed_node3 15621 1726882627.02532: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882627.02535: Calling groups_plugins_play to load vars for managed_node3 15621 1726882627.03596: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 15621 1726882627.04750: done with get_vars() 15621 1726882627.04765: variable 'ansible_search_path' from source: unknown 15621 1726882627.04766: variable 'ansible_search_path' from source: unknown 15621 1726882627.04795: we have included files to process 15621 1726882627.04796: generating all_blocks data 15621 1726882627.04798: done generating all_blocks data 15621 1726882627.04799: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 15621 1726882627.04800: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 15621 1726882627.04802: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 15621 1726882627.05565: done processing included file 15621 1726882627.05567: iterating over new_blocks loaded from include file 15621 1726882627.05568: in VariableManager get_vars() 15621 1726882627.05579: done with get_vars() 15621 1726882627.05581: filtering new block on tags 15621 1726882627.05597: done filtering new block on tags 15621 1726882627.05598: in VariableManager get_vars() 15621 1726882627.05606: done with get_vars() 15621 1726882627.05607: filtering new block on tags 15621 1726882627.05620: done filtering new block on tags 15621 1726882627.05621: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node3 15621 1726882627.05628: extending task lists for all hosts with included blocks 15621 1726882627.05696: done extending task lists 15621 1726882627.05697: done processing included files 15621 1726882627.05697: results queue empty 15621 1726882627.05698: checking for any_errors_fatal 15621 1726882627.05700: done checking for any_errors_fatal 15621 1726882627.05701: checking for max_fail_percentage 15621 1726882627.05702: done checking for max_fail_percentage 15621 1726882627.05702: checking to see if all hosts have failed and the running result is not ok 15621 1726882627.05703: done checking to see if all hosts have failed 15621 1726882627.05703: getting the remaining hosts for this loop 15621 1726882627.05704: done getting the remaining hosts for this loop 15621 1726882627.05706: getting the next task for host managed_node3 15621 1726882627.05709: done getting next task for host managed_node3 15621 1726882627.05710: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 15621 1726882627.05713: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882627.05714: getting variables 15621 1726882627.05715: in VariableManager get_vars() 15621 1726882627.05759: Calling all_inventory to load vars for managed_node3 15621 1726882627.05761: Calling groups_inventory to load vars for managed_node3 15621 1726882627.05763: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882627.05768: Calling all_plugins_play to load vars for managed_node3 15621 1726882627.05770: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882627.05772: Calling groups_plugins_play to load vars for managed_node3 15621 1726882627.06558: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882627.07755: done with get_vars() 15621 1726882627.07771: done getting variables 15621 1726882627.07804: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:37:07 -0400 (0:00:00.065) 0:00:59.157 ****** 15621 1726882627.07829: entering _queue_task() for managed_node3/set_fact 15621 1726882627.08113: worker is 1 (out of 1 available) 15621 1726882627.08128: exiting _queue_task() for managed_node3/set_fact 15621 1726882627.08143: done queuing things up, now waiting for results queue to drain 15621 1726882627.08145: waiting for pending results... 
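Note: the set_fact task being queued here is the first task of get_profile_stat.yml (task path get_profile_stat.yml:3). A minimal sketch of what that task likely looks like, reconstructed from the set_fact result that appears further down in this log (all three flags come back false); the actual task in the fedora.linux_system_roles collection may differ in wording:

- name: Initialize NM profile exist and ansible_managed comment flag
  ansible.builtin.set_fact:
    # Reset the three flags that later tasks in get_profile_stat.yml update;
    # the logged result below shows exactly these facts, all set to false.
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false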
15621 1726882627.08330: running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag 15621 1726882627.08416: in run() - task 0affc7ec-ae25-af1a-5b92-000000000502 15621 1726882627.08431: variable 'ansible_search_path' from source: unknown 15621 1726882627.08435: variable 'ansible_search_path' from source: unknown 15621 1726882627.08466: calling self._execute() 15621 1726882627.08537: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882627.08543: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882627.08553: variable 'omit' from source: magic vars 15621 1726882627.08847: variable 'ansible_distribution_major_version' from source: facts 15621 1726882627.08859: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882627.08865: variable 'omit' from source: magic vars 15621 1726882627.08900: variable 'omit' from source: magic vars 15621 1726882627.08932: variable 'omit' from source: magic vars 15621 1726882627.08965: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882627.08997: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882627.09013: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882627.09030: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882627.09041: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882627.09067: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882627.09070: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882627.09075: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882627.09151: Set connection var ansible_connection to ssh 15621 1726882627.09158: Set connection var ansible_shell_executable to /bin/sh 15621 1726882627.09161: Set connection var ansible_timeout to 10 15621 1726882627.09164: Set connection var ansible_shell_type to sh 15621 1726882627.09170: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882627.09178: Set connection var ansible_pipelining to False 15621 1726882627.09195: variable 'ansible_shell_executable' from source: unknown 15621 1726882627.09198: variable 'ansible_connection' from source: unknown 15621 1726882627.09201: variable 'ansible_module_compression' from source: unknown 15621 1726882627.09203: variable 'ansible_shell_type' from source: unknown 15621 1726882627.09206: variable 'ansible_shell_executable' from source: unknown 15621 1726882627.09208: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882627.09213: variable 'ansible_pipelining' from source: unknown 15621 1726882627.09215: variable 'ansible_timeout' from source: unknown 15621 1726882627.09220: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882627.09333: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15621 1726882627.09342: variable 
'omit' from source: magic vars 15621 1726882627.09348: starting attempt loop 15621 1726882627.09352: running the handler 15621 1726882627.09364: handler run complete 15621 1726882627.09380: attempt loop complete, returning result 15621 1726882627.09383: _execute() done 15621 1726882627.09385: dumping result to json 15621 1726882627.09388: done dumping result, returning 15621 1726882627.09390: done running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag [0affc7ec-ae25-af1a-5b92-000000000502] 15621 1726882627.09395: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000502 15621 1726882627.09482: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000502 15621 1726882627.09486: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 15621 1726882627.09544: no more pending results, returning what we have 15621 1726882627.09548: results queue empty 15621 1726882627.09549: checking for any_errors_fatal 15621 1726882627.09550: done checking for any_errors_fatal 15621 1726882627.09551: checking for max_fail_percentage 15621 1726882627.09553: done checking for max_fail_percentage 15621 1726882627.09554: checking to see if all hosts have failed and the running result is not ok 15621 1726882627.09555: done checking to see if all hosts have failed 15621 1726882627.09556: getting the remaining hosts for this loop 15621 1726882627.09557: done getting the remaining hosts for this loop 15621 1726882627.09562: getting the next task for host managed_node3 15621 1726882627.09569: done getting next task for host managed_node3 15621 1726882627.09572: ^ task is: TASK: Stat profile file 15621 1726882627.09578: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882627.09583: getting variables 15621 1726882627.09585: in VariableManager get_vars() 15621 1726882627.09614: Calling all_inventory to load vars for managed_node3 15621 1726882627.09617: Calling groups_inventory to load vars for managed_node3 15621 1726882627.09621: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882627.09634: Calling all_plugins_play to load vars for managed_node3 15621 1726882627.09637: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882627.09640: Calling groups_plugins_play to load vars for managed_node3 15621 1726882627.10607: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882627.11772: done with get_vars() 15621 1726882627.11795: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:37:07 -0400 (0:00:00.040) 0:00:59.197 ****** 15621 1726882627.11869: entering _queue_task() for managed_node3/stat 15621 1726882627.12148: worker is 1 (out of 1 available) 15621 1726882627.12164: exiting _queue_task() for managed_node3/stat 15621 1726882627.12180: done queuing things up, now waiting for results queue to drain 15621 1726882627.12182: waiting for pending results... 15621 1726882627.12372: running TaskExecutor() for managed_node3/TASK: Stat profile file 15621 1726882627.12461: in run() - task 0affc7ec-ae25-af1a-5b92-000000000503 15621 1726882627.12470: variable 'ansible_search_path' from source: unknown 15621 1726882627.12476: variable 'ansible_search_path' from source: unknown 15621 1726882627.12505: calling self._execute() 15621 1726882627.12579: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882627.12583: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882627.12592: variable 'omit' from source: magic vars 15621 1726882627.12889: variable 'ansible_distribution_major_version' from source: facts 15621 1726882627.12901: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882627.12910: variable 'omit' from source: magic vars 15621 1726882627.12952: variable 'omit' from source: magic vars 15621 1726882627.13031: variable 'profile' from source: include params 15621 1726882627.13035: variable 'interface' from source: set_fact 15621 1726882627.13093: variable 'interface' from source: set_fact 15621 1726882627.13107: variable 'omit' from source: magic vars 15621 1726882627.13148: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882627.13182: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882627.13199: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882627.13214: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882627.13230: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882627.13254: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882627.13257: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882627.13261: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882627.13337: Set connection var ansible_connection to ssh 15621 1726882627.13347: Set connection var ansible_shell_executable to /bin/sh 15621 1726882627.13350: Set connection var ansible_timeout to 10 15621 1726882627.13353: Set connection var ansible_shell_type to sh 15621 1726882627.13358: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882627.13364: Set connection var ansible_pipelining to False 15621 1726882627.13387: variable 'ansible_shell_executable' from source: unknown 15621 1726882627.13390: variable 'ansible_connection' from source: unknown 15621 1726882627.13393: variable 'ansible_module_compression' from source: unknown 15621 1726882627.13395: variable 'ansible_shell_type' from source: unknown 15621 1726882627.13399: variable 'ansible_shell_executable' from source: unknown 15621 1726882627.13401: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882627.13404: variable 'ansible_pipelining' from source: unknown 15621 1726882627.13407: variable 'ansible_timeout' from source: unknown 15621 1726882627.13412: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882627.13576: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15621 1726882627.13587: variable 'omit' from source: magic vars 15621 1726882627.13594: starting attempt loop 15621 1726882627.13597: running the handler 15621 1726882627.13608: _low_level_execute_command(): starting 15621 1726882627.13616: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15621 1726882627.14171: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882627.14178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882627.14183: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found <<< 15621 1726882627.14187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882627.14228: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882627.14232: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882627.14237: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882627.14334: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882627.16093: stdout chunk (state=3): >>>/root <<< 15621 1726882627.16195: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 <<< 15621 1726882627.16255: stderr chunk (state=3): >>><<< 15621 1726882627.16259: stdout chunk (state=3): >>><<< 15621 1726882627.16281: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882627.16292: _low_level_execute_command(): starting 15621 1726882627.16299: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882627.1627877-17766-184130635607024 `" && echo ansible-tmp-1726882627.1627877-17766-184130635607024="` echo /root/.ansible/tmp/ansible-tmp-1726882627.1627877-17766-184130635607024 `" ) && sleep 0' 15621 1726882627.16779: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882627.16783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882627.16793: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882627.16795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882627.16843: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882627.16847: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882627.16942: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882627.18916: stdout chunk (state=3): >>>ansible-tmp-1726882627.1627877-17766-184130635607024=/root/.ansible/tmp/ansible-tmp-1726882627.1627877-17766-184130635607024 <<< 15621 1726882627.19029: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882627.19088: stderr chunk (state=3): >>><<< 15621 1726882627.19091: stdout chunk (state=3): >>><<< 15621 1726882627.19107: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882627.1627877-17766-184130635607024=/root/.ansible/tmp/ansible-tmp-1726882627.1627877-17766-184130635607024 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882627.19153: variable 'ansible_module_compression' from source: unknown 15621 1726882627.19202: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15621x2kdpbzd/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 15621 1726882627.19236: variable 'ansible_facts' from source: unknown 15621 1726882627.19304: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882627.1627877-17766-184130635607024/AnsiballZ_stat.py 15621 1726882627.19414: Sending initial data 15621 1726882627.19418: Sent initial data (153 bytes) 15621 1726882627.19909: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882627.19913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882627.19916: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882627.19918: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882627.19921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 15621 1726882627.19925: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882627.19974: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882627.19978: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK 
<<< 15621 1726882627.19980: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882627.20069: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882627.21672: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15621 1726882627.21751: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15621 1726882627.21837: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmpom87gcdm /root/.ansible/tmp/ansible-tmp-1726882627.1627877-17766-184130635607024/AnsiballZ_stat.py <<< 15621 1726882627.21845: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882627.1627877-17766-184130635607024/AnsiballZ_stat.py" <<< 15621 1726882627.21923: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmpom87gcdm" to remote "/root/.ansible/tmp/ansible-tmp-1726882627.1627877-17766-184130635607024/AnsiballZ_stat.py" <<< 15621 1726882627.21926: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882627.1627877-17766-184130635607024/AnsiballZ_stat.py" <<< 15621 1726882627.22651: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882627.22727: stderr chunk (state=3): >>><<< 15621 1726882627.22731: stdout chunk (state=3): >>><<< 15621 1726882627.22756: done transferring module to remote 15621 1726882627.22765: _low_level_execute_command(): starting 15621 1726882627.22769: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882627.1627877-17766-184130635607024/ /root/.ansible/tmp/ansible-tmp-1726882627.1627877-17766-184130635607024/AnsiballZ_stat.py && sleep 0' 15621 1726882627.23252: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882627.23255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 15621 1726882627.23258: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882627.23261: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882627.23267: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882627.23320: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882627.23329: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882627.23409: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882627.25239: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882627.25297: stderr chunk (state=3): >>><<< 15621 1726882627.25300: stdout chunk (state=3): >>><<< 15621 1726882627.25315: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882627.25318: _low_level_execute_command(): starting 15621 1726882627.25325: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882627.1627877-17766-184130635607024/AnsiballZ_stat.py && sleep 0' 15621 1726882627.25810: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882627.25814: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882627.25816: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882627.25819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882627.25882: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882627.25885: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882627.25973: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882627.42312: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-lsr27", "follow": false, "checksum_algorithm": "sha1"}}} <<< 15621 1726882627.43620: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. <<< 15621 1726882627.43681: stderr chunk (state=3): >>><<< 15621 1726882627.43684: stdout chunk (state=3): >>><<< 15621 1726882627.43698: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-lsr27", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 
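Note: the stat invocation above resolved the profile path to /etc/sysconfig/network-scripts/ifcfg-lsr27 and found no such file. A minimal sketch of an equivalent "Stat profile file" task, reconstructed from the module_args recorded in the invocation; the register name profile_stat and the exact Jinja2 path expression are placeholder assumptions for illustration, not necessarily what the collection's get_profile_stat.yml uses:

- name: Stat profile file
  ansible.builtin.stat:
    # Mirrors the module_args seen in the log; follow and checksum_algorithm
    # were left at their defaults (false / sha1) in the recorded invocation.
    path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: profile_stat  # assumed variable name, for illustration only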
15621 1726882627.43725: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-lsr27', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882627.1627877-17766-184130635607024/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15621 1726882627.43734: _low_level_execute_command(): starting 15621 1726882627.43740: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882627.1627877-17766-184130635607024/ > /dev/null 2>&1 && sleep 0' 15621 1726882627.44233: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882627.44237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882627.44239: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882627.44254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882627.44307: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882627.44310: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882627.44314: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882627.44397: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882627.53597: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882627.53659: stderr chunk (state=3): >>><<< 15621 1726882627.53663: stdout chunk (state=3): >>><<< 15621 1726882627.53676: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882627.53686: handler run complete 15621 1726882627.53704: attempt loop complete, returning result 15621 1726882627.53707: _execute() done 15621 1726882627.53709: dumping result to json 15621 1726882627.53716: done dumping result, returning 15621 1726882627.53726: done running TaskExecutor() for managed_node3/TASK: Stat profile file [0affc7ec-ae25-af1a-5b92-000000000503] 15621 1726882627.53732: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000503 15621 1726882627.53840: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000503 15621 1726882627.53844: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 15621 1726882627.53905: no more pending results, returning what we have 15621 1726882627.53909: results queue empty 15621 1726882627.53910: checking for any_errors_fatal 15621 1726882627.53918: done checking for any_errors_fatal 15621 1726882627.53919: checking for max_fail_percentage 15621 1726882627.53920: done checking for max_fail_percentage 15621 1726882627.53921: checking to see if all hosts have failed and the running result is not ok 15621 1726882627.53930: done checking to see if all hosts have failed 15621 1726882627.53931: getting the remaining hosts for this loop 15621 1726882627.53932: done getting the remaining hosts for this loop 15621 1726882627.53937: getting the next task for host managed_node3 15621 1726882627.53944: done getting next task for host managed_node3 15621 1726882627.53947: ^ task is: TASK: Set NM profile exist flag based on the profile files 15621 1726882627.53951: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882627.53955: getting variables 15621 1726882627.53957: in VariableManager get_vars() 15621 1726882627.53989: Calling all_inventory to load vars for managed_node3 15621 1726882627.53992: Calling groups_inventory to load vars for managed_node3 15621 1726882627.53996: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882627.54008: Calling all_plugins_play to load vars for managed_node3 15621 1726882627.54010: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882627.54013: Calling groups_plugins_play to load vars for managed_node3 15621 1726882627.58357: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882627.59503: done with get_vars() 15621 1726882627.59525: done getting variables 15621 1726882627.59566: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:37:07 -0400 (0:00:00.477) 0:00:59.675 ****** 15621 1726882627.59588: entering _queue_task() for managed_node3/set_fact 15621 1726882627.59877: worker is 1 (out of 1 available) 15621 1726882627.59890: exiting _queue_task() for managed_node3/set_fact 15621 1726882627.59903: done queuing things up, now waiting for results queue to drain 15621 1726882627.59905: waiting for pending results... 
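The set_fact task queued here only fires when the ifcfg file was found. A minimal sketch of that shape follows; the when conditions mirror the conditionals evaluated in the log (whether they sit on the task itself or are inherited from an enclosing include is not visible here), and the fact name is an assumption used purely for illustration.

  - name: Set NM profile exist flag based on the profile files
    ansible.builtin.set_fact:
      lsr_net_profile_exists: true  # assumed fact name, for illustration only
    when:
      - ansible_distribution_major_version != '6'  # evaluated True below
      - profile_stat.stat.exists                   # evaluated False below, so the task is skipped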
15621 1726882627.60101: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files 15621 1726882627.60212: in run() - task 0affc7ec-ae25-af1a-5b92-000000000504 15621 1726882627.60225: variable 'ansible_search_path' from source: unknown 15621 1726882627.60229: variable 'ansible_search_path' from source: unknown 15621 1726882627.60263: calling self._execute() 15621 1726882627.60339: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882627.60345: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882627.60355: variable 'omit' from source: magic vars 15621 1726882627.60662: variable 'ansible_distribution_major_version' from source: facts 15621 1726882627.60674: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882627.60770: variable 'profile_stat' from source: set_fact 15621 1726882627.60786: Evaluated conditional (profile_stat.stat.exists): False 15621 1726882627.60790: when evaluation is False, skipping this task 15621 1726882627.60793: _execute() done 15621 1726882627.60798: dumping result to json 15621 1726882627.60800: done dumping result, returning 15621 1726882627.60804: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files [0affc7ec-ae25-af1a-5b92-000000000504] 15621 1726882627.60814: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000504 15621 1726882627.60902: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000504 15621 1726882627.60905: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15621 1726882627.60965: no more pending results, returning what we have 15621 1726882627.60970: results queue empty 15621 1726882627.60971: checking for any_errors_fatal 15621 1726882627.60984: done checking for any_errors_fatal 15621 1726882627.60985: checking for max_fail_percentage 15621 1726882627.60986: done checking for max_fail_percentage 15621 1726882627.60987: checking to see if all hosts have failed and the running result is not ok 15621 1726882627.60988: done checking to see if all hosts have failed 15621 1726882627.60989: getting the remaining hosts for this loop 15621 1726882627.60990: done getting the remaining hosts for this loop 15621 1726882627.60994: getting the next task for host managed_node3 15621 1726882627.61000: done getting next task for host managed_node3 15621 1726882627.61003: ^ task is: TASK: Get NM profile info 15621 1726882627.61006: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882627.61011: getting variables 15621 1726882627.61012: in VariableManager get_vars() 15621 1726882627.61044: Calling all_inventory to load vars for managed_node3 15621 1726882627.61047: Calling groups_inventory to load vars for managed_node3 15621 1726882627.61051: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882627.61062: Calling all_plugins_play to load vars for managed_node3 15621 1726882627.61064: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882627.61067: Calling groups_plugins_play to load vars for managed_node3 15621 1726882627.62032: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882627.63198: done with get_vars() 15621 1726882627.63220: done getting variables 15621 1726882627.63298: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:37:07 -0400 (0:00:00.037) 0:00:59.712 ****** 15621 1726882627.63328: entering _queue_task() for managed_node3/shell 15621 1726882627.63330: Creating lock for shell 15621 1726882627.63612: worker is 1 (out of 1 available) 15621 1726882627.63628: exiting _queue_task() for managed_node3/shell 15621 1726882627.63642: done queuing things up, now waiting for results queue to drain 15621 1726882627.63644: waiting for pending results... 
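The "Get NM profile info" task queued here is a shell pipeline. Based on the command string, the nm_profile_exists.rc check, and the "...ignoring" on the failed result later in this run, it is roughly equivalent to the sketch below; the literal lsr27 is the rendered value of the profile/interface variable, and the register name is inferred rather than shown directly.

  - name: Get NM profile info
    ansible.builtin.shell: nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc
    register: nm_profile_exists  # name inferred from the rc check later in the log
    ignore_errors: true          # inferred from the "...ignoring" on the failed result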
15621 1726882627.63844: running TaskExecutor() for managed_node3/TASK: Get NM profile info 15621 1726882627.63938: in run() - task 0affc7ec-ae25-af1a-5b92-000000000505 15621 1726882627.63950: variable 'ansible_search_path' from source: unknown 15621 1726882627.63954: variable 'ansible_search_path' from source: unknown 15621 1726882627.63992: calling self._execute() 15621 1726882627.64068: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882627.64075: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882627.64091: variable 'omit' from source: magic vars 15621 1726882627.64392: variable 'ansible_distribution_major_version' from source: facts 15621 1726882627.64404: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882627.64412: variable 'omit' from source: magic vars 15621 1726882627.64452: variable 'omit' from source: magic vars 15621 1726882627.64533: variable 'profile' from source: include params 15621 1726882627.64538: variable 'interface' from source: set_fact 15621 1726882627.64590: variable 'interface' from source: set_fact 15621 1726882627.64605: variable 'omit' from source: magic vars 15621 1726882627.64644: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882627.64676: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882627.64696: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882627.64710: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882627.64720: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882627.64750: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882627.64754: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882627.64759: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882627.64833: Set connection var ansible_connection to ssh 15621 1726882627.64841: Set connection var ansible_shell_executable to /bin/sh 15621 1726882627.64847: Set connection var ansible_timeout to 10 15621 1726882627.64851: Set connection var ansible_shell_type to sh 15621 1726882627.64854: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882627.64862: Set connection var ansible_pipelining to False 15621 1726882627.64885: variable 'ansible_shell_executable' from source: unknown 15621 1726882627.64889: variable 'ansible_connection' from source: unknown 15621 1726882627.64892: variable 'ansible_module_compression' from source: unknown 15621 1726882627.64895: variable 'ansible_shell_type' from source: unknown 15621 1726882627.64897: variable 'ansible_shell_executable' from source: unknown 15621 1726882627.64900: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882627.64902: variable 'ansible_pipelining' from source: unknown 15621 1726882627.64905: variable 'ansible_timeout' from source: unknown 15621 1726882627.64911: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882627.65032: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15621 1726882627.65042: variable 'omit' from source: magic vars 15621 1726882627.65047: starting attempt loop 15621 1726882627.65050: running the handler 15621 1726882627.65058: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15621 1726882627.65078: _low_level_execute_command(): starting 15621 1726882627.65081: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15621 1726882627.65641: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882627.65646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882627.65649: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882627.65651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882627.65711: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882627.65717: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882627.65719: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882627.65805: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882627.67559: stdout chunk (state=3): >>>/root <<< 15621 1726882627.67714: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882627.67720: stdout chunk (state=3): >>><<< 15621 1726882627.67730: stderr chunk (state=3): >>><<< 15621 1726882627.67752: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 
10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882627.67764: _low_level_execute_command(): starting 15621 1726882627.67770: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882627.677518-17775-100403536592907 `" && echo ansible-tmp-1726882627.677518-17775-100403536592907="` echo /root/.ansible/tmp/ansible-tmp-1726882627.677518-17775-100403536592907 `" ) && sleep 0' 15621 1726882627.68263: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882627.68266: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found <<< 15621 1726882627.68270: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882627.68272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882627.68333: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882627.68338: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882627.68340: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882627.68423: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882627.70398: stdout chunk (state=3): >>>ansible-tmp-1726882627.677518-17775-100403536592907=/root/.ansible/tmp/ansible-tmp-1726882627.677518-17775-100403536592907 <<< 15621 1726882627.70541: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882627.70564: stderr chunk (state=3): >>><<< 15621 1726882627.70567: stdout chunk (state=3): >>><<< 15621 1726882627.70586: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882627.677518-17775-100403536592907=/root/.ansible/tmp/ansible-tmp-1726882627.677518-17775-100403536592907 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882627.70613: variable 'ansible_module_compression' from source: unknown 15621 1726882627.70663: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15621x2kdpbzd/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15621 1726882627.70698: variable 'ansible_facts' from source: unknown 15621 1726882627.70753: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882627.677518-17775-100403536592907/AnsiballZ_command.py 15621 1726882627.70864: Sending initial data 15621 1726882627.70867: Sent initial data (155 bytes) 15621 1726882627.71348: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882627.71352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882627.71354: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882627.71356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882627.71411: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882627.71421: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882627.71426: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882627.71497: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882627.73106: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15621 1726882627.73189: stderr 
chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15621 1726882627.73278: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmpfc38knzc /root/.ansible/tmp/ansible-tmp-1726882627.677518-17775-100403536592907/AnsiballZ_command.py <<< 15621 1726882627.73281: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882627.677518-17775-100403536592907/AnsiballZ_command.py" <<< 15621 1726882627.73357: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmpfc38knzc" to remote "/root/.ansible/tmp/ansible-tmp-1726882627.677518-17775-100403536592907/AnsiballZ_command.py" <<< 15621 1726882627.73362: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882627.677518-17775-100403536592907/AnsiballZ_command.py" <<< 15621 1726882627.74093: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882627.74165: stderr chunk (state=3): >>><<< 15621 1726882627.74168: stdout chunk (state=3): >>><<< 15621 1726882627.74187: done transferring module to remote 15621 1726882627.74198: _low_level_execute_command(): starting 15621 1726882627.74204: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882627.677518-17775-100403536592907/ /root/.ansible/tmp/ansible-tmp-1726882627.677518-17775-100403536592907/AnsiballZ_command.py && sleep 0' 15621 1726882627.74687: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882627.74690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 15621 1726882627.74697: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15621 1726882627.74699: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found <<< 15621 1726882627.74702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882627.74753: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882627.74758: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882627.74760: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882627.74843: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882627.76676: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882627.76720: stderr chunk (state=3): >>><<< 15621 1726882627.76726: stdout chunk (state=3): >>><<< 15621 1726882627.76739: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882627.76749: _low_level_execute_command(): starting 15621 1726882627.76751: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882627.677518-17775-100403536592907/AnsiballZ_command.py && sleep 0' 15621 1726882627.77210: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882627.77214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882627.77217: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 15621 1726882627.77219: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882627.77271: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882627.77279: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882627.77280: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882627.77366: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882627.95592: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc", "start": "2024-09-20 21:37:07.936676", "end": "2024-09-20 21:37:07.954029", "delta": "0:00:00.017353", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15621 
1726882627.97058: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.45.226 closed. <<< 15621 1726882627.97115: stderr chunk (state=3): >>><<< 15621 1726882627.97118: stdout chunk (state=3): >>><<< 15621 1726882627.97139: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc", "start": "2024-09-20 21:37:07.936676", "end": "2024-09-20 21:37:07.954029", "delta": "0:00:00.017353", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.45.226 closed. 
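rc=1 here is simply grep reporting no match: NetworkManager lists no connection named lsr27 backed by a file under /etc, so the profile is treated as absent. If a test did not want to ignore errors wholesale, the same idea could be expressed by treating only rc values other than 0 and 1 as real failures; the sketch below is an alternative formulation, not what this log shows.

  - name: Get NM profile info (alternative that treats "no match" as a normal outcome)
    ansible.builtin.shell: nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc
    register: nm_profile_exists
    failed_when: nm_profile_exists.rc not in [0, 1]  # grep exits 1 when nothing matches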
15621 1726882627.97277: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882627.677518-17775-100403536592907/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15621 1726882627.97285: _low_level_execute_command(): starting 15621 1726882627.97288: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882627.677518-17775-100403536592907/ > /dev/null 2>&1 && sleep 0' 15621 1726882627.98518: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882627.98549: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882627.98580: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882627.98795: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882628.00884: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882628.00926: stderr chunk (state=3): >>><<< 15621 1726882628.00930: stdout chunk (state=3): >>><<< 15621 1726882628.00949: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 
10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882628.01334: handler run complete 15621 1726882628.01338: Evaluated conditional (False): False 15621 1726882628.01341: attempt loop complete, returning result 15621 1726882628.01343: _execute() done 15621 1726882628.01345: dumping result to json 15621 1726882628.01348: done dumping result, returning 15621 1726882628.01350: done running TaskExecutor() for managed_node3/TASK: Get NM profile info [0affc7ec-ae25-af1a-5b92-000000000505] 15621 1726882628.01352: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000505 fatal: [managed_node3]: FAILED! => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc", "delta": "0:00:00.017353", "end": "2024-09-20 21:37:07.954029", "rc": 1, "start": "2024-09-20 21:37:07.936676" } MSG: non-zero return code ...ignoring 15621 1726882628.01537: no more pending results, returning what we have 15621 1726882628.01541: results queue empty 15621 1726882628.01542: checking for any_errors_fatal 15621 1726882628.01550: done checking for any_errors_fatal 15621 1726882628.01551: checking for max_fail_percentage 15621 1726882628.01552: done checking for max_fail_percentage 15621 1726882628.01553: checking to see if all hosts have failed and the running result is not ok 15621 1726882628.01555: done checking to see if all hosts have failed 15621 1726882628.01556: getting the remaining hosts for this loop 15621 1726882628.01557: done getting the remaining hosts for this loop 15621 1726882628.01562: getting the next task for host managed_node3 15621 1726882628.01569: done getting next task for host managed_node3 15621 1726882628.01572: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 15621 1726882628.01579: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882628.01585: getting variables 15621 1726882628.01586: in VariableManager get_vars() 15621 1726882628.01732: Calling all_inventory to load vars for managed_node3 15621 1726882628.01738: Calling groups_inventory to load vars for managed_node3 15621 1726882628.01742: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882628.01759: Calling all_plugins_play to load vars for managed_node3 15621 1726882628.01762: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882628.01766: Calling groups_plugins_play to load vars for managed_node3 15621 1726882628.02518: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000505 15621 1726882628.02524: WORKER PROCESS EXITING 15621 1726882628.04460: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882628.06707: done with get_vars() 15621 1726882628.06737: done getting variables 15621 1726882628.06807: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:37:08 -0400 (0:00:00.435) 0:01:00.147 ****** 15621 1726882628.06841: entering _queue_task() for managed_node3/set_fact 15621 1726882628.07367: worker is 1 (out of 1 available) 15621 1726882628.07384: exiting _queue_task() for managed_node3/set_fact 15621 1726882628.07396: done queuing things up, now waiting for results queue to drain 15621 1726882628.07398: waiting for pending results... 
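The set_fact task queued here flips the "profile exists" and "ansible managed" flags only when the nmcli pipeline found a match. A sketch of that shape, with the when condition taken from the conditional evaluated below and the fact names as placeholder assumptions:

  - name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
    ansible.builtin.set_fact:
      lsr_net_profile_exists: true           # placeholder fact names, for illustration only
      lsr_net_profile_ansible_managed: true
    when: nm_profile_exists.rc == 0          # evaluated False below, so the task is skipped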
15621 1726882628.07665: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 15621 1726882628.07835: in run() - task 0affc7ec-ae25-af1a-5b92-000000000506 15621 1726882628.07862: variable 'ansible_search_path' from source: unknown 15621 1726882628.07871: variable 'ansible_search_path' from source: unknown 15621 1726882628.07930: calling self._execute() 15621 1726882628.08045: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882628.08064: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882628.08109: variable 'omit' from source: magic vars 15621 1726882628.08563: variable 'ansible_distribution_major_version' from source: facts 15621 1726882628.08586: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882628.08766: variable 'nm_profile_exists' from source: set_fact 15621 1726882628.08781: Evaluated conditional (nm_profile_exists.rc == 0): False 15621 1726882628.08790: when evaluation is False, skipping this task 15621 1726882628.08878: _execute() done 15621 1726882628.08882: dumping result to json 15621 1726882628.08886: done dumping result, returning 15621 1726882628.08889: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affc7ec-ae25-af1a-5b92-000000000506] 15621 1726882628.08892: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000506 15621 1726882628.08970: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000506 15621 1726882628.09024: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 15621 1726882628.09084: no more pending results, returning what we have 15621 1726882628.09094: results queue empty 15621 1726882628.09095: checking for any_errors_fatal 15621 1726882628.09106: done checking for any_errors_fatal 15621 1726882628.09107: checking for max_fail_percentage 15621 1726882628.09109: done checking for max_fail_percentage 15621 1726882628.09110: checking to see if all hosts have failed and the running result is not ok 15621 1726882628.09112: done checking to see if all hosts have failed 15621 1726882628.09113: getting the remaining hosts for this loop 15621 1726882628.09114: done getting the remaining hosts for this loop 15621 1726882628.09119: getting the next task for host managed_node3 15621 1726882628.09133: done getting next task for host managed_node3 15621 1726882628.09136: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 15621 1726882628.09141: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 15621 1726882628.09146: getting variables 15621 1726882628.09148: in VariableManager get_vars() 15621 1726882628.09187: Calling all_inventory to load vars for managed_node3 15621 1726882628.09190: Calling groups_inventory to load vars for managed_node3 15621 1726882628.09310: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882628.09329: Calling all_plugins_play to load vars for managed_node3 15621 1726882628.09333: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882628.09337: Calling groups_plugins_play to load vars for managed_node3 15621 1726882628.11458: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882628.12869: done with get_vars() 15621 1726882628.12899: done getting variables 15621 1726882628.12952: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15621 1726882628.13056: variable 'profile' from source: include params 15621 1726882628.13060: variable 'interface' from source: set_fact 15621 1726882628.13110: variable 'interface' from source: set_fact TASK [Get the ansible_managed comment in ifcfg-lsr27] ************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:37:08 -0400 (0:00:00.062) 0:01:00.210 ****** 15621 1726882628.13138: entering _queue_task() for managed_node3/command 15621 1726882628.13433: worker is 1 (out of 1 available) 15621 1726882628.13449: exiting _queue_task() for managed_node3/command 15621 1726882628.13461: done queuing things up, now waiting for results queue to drain 15621 1726882628.13463: waiting for pending results... 
15621 1726882628.13657: running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-lsr27 15621 1726882628.13783: in run() - task 0affc7ec-ae25-af1a-5b92-000000000508 15621 1726882628.13836: variable 'ansible_search_path' from source: unknown 15621 1726882628.13839: variable 'ansible_search_path' from source: unknown 15621 1726882628.13864: calling self._execute() 15621 1726882628.14011: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882628.14031: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882628.14048: variable 'omit' from source: magic vars 15621 1726882628.14497: variable 'ansible_distribution_major_version' from source: facts 15621 1726882628.14516: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882628.14724: variable 'profile_stat' from source: set_fact 15621 1726882628.14777: Evaluated conditional (profile_stat.stat.exists): False 15621 1726882628.14790: when evaluation is False, skipping this task 15621 1726882628.14800: _execute() done 15621 1726882628.14865: dumping result to json 15621 1726882628.14868: done dumping result, returning 15621 1726882628.14871: done running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-lsr27 [0affc7ec-ae25-af1a-5b92-000000000508] 15621 1726882628.14874: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000508 15621 1726882628.14984: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000508 15621 1726882628.14987: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15621 1726882628.15059: no more pending results, returning what we have 15621 1726882628.15063: results queue empty 15621 1726882628.15064: checking for any_errors_fatal 15621 1726882628.15076: done checking for any_errors_fatal 15621 1726882628.15077: checking for max_fail_percentage 15621 1726882628.15079: done checking for max_fail_percentage 15621 1726882628.15080: checking to see if all hosts have failed and the running result is not ok 15621 1726882628.15082: done checking to see if all hosts have failed 15621 1726882628.15082: getting the remaining hosts for this loop 15621 1726882628.15084: done getting the remaining hosts for this loop 15621 1726882628.15092: getting the next task for host managed_node3 15621 1726882628.15100: done getting next task for host managed_node3 15621 1726882628.15103: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 15621 1726882628.15107: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882628.15111: getting variables 15621 1726882628.15112: in VariableManager get_vars() 15621 1726882628.15145: Calling all_inventory to load vars for managed_node3 15621 1726882628.15148: Calling groups_inventory to load vars for managed_node3 15621 1726882628.15152: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882628.15163: Calling all_plugins_play to load vars for managed_node3 15621 1726882628.15166: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882628.15169: Calling groups_plugins_play to load vars for managed_node3 15621 1726882628.16168: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882628.17682: done with get_vars() 15621 1726882628.17714: done getting variables 15621 1726882628.17788: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15621 1726882628.17919: variable 'profile' from source: include params 15621 1726882628.17926: variable 'interface' from source: set_fact 15621 1726882628.17997: variable 'interface' from source: set_fact TASK [Verify the ansible_managed comment in ifcfg-lsr27] *********************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:37:08 -0400 (0:00:00.048) 0:01:00.259 ****** 15621 1726882628.18036: entering _queue_task() for managed_node3/set_fact 15621 1726882628.18685: worker is 1 (out of 1 available) 15621 1726882628.18695: exiting _queue_task() for managed_node3/set_fact 15621 1726882628.18706: done queuing things up, now waiting for results queue to drain 15621 1726882628.18707: waiting for pending results... 
15621 1726882628.18749: running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-lsr27 15621 1726882628.18843: in run() - task 0affc7ec-ae25-af1a-5b92-000000000509 15621 1726882628.18857: variable 'ansible_search_path' from source: unknown 15621 1726882628.18868: variable 'ansible_search_path' from source: unknown 15621 1726882628.18898: calling self._execute() 15621 1726882628.19027: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882628.19029: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882628.19032: variable 'omit' from source: magic vars 15621 1726882628.19311: variable 'ansible_distribution_major_version' from source: facts 15621 1726882628.19324: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882628.19419: variable 'profile_stat' from source: set_fact 15621 1726882628.19433: Evaluated conditional (profile_stat.stat.exists): False 15621 1726882628.19436: when evaluation is False, skipping this task 15621 1726882628.19439: _execute() done 15621 1726882628.19444: dumping result to json 15621 1726882628.19446: done dumping result, returning 15621 1726882628.19454: done running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-lsr27 [0affc7ec-ae25-af1a-5b92-000000000509] 15621 1726882628.19460: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000509 15621 1726882628.19556: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000509 15621 1726882628.19559: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15621 1726882628.19641: no more pending results, returning what we have 15621 1726882628.19645: results queue empty 15621 1726882628.19646: checking for any_errors_fatal 15621 1726882628.19654: done checking for any_errors_fatal 15621 1726882628.19654: checking for max_fail_percentage 15621 1726882628.19656: done checking for max_fail_percentage 15621 1726882628.19656: checking to see if all hosts have failed and the running result is not ok 15621 1726882628.19657: done checking to see if all hosts have failed 15621 1726882628.19658: getting the remaining hosts for this loop 15621 1726882628.19660: done getting the remaining hosts for this loop 15621 1726882628.19664: getting the next task for host managed_node3 15621 1726882628.19672: done getting next task for host managed_node3 15621 1726882628.19675: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 15621 1726882628.19679: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882628.19682: getting variables 15621 1726882628.19684: in VariableManager get_vars() 15621 1726882628.19713: Calling all_inventory to load vars for managed_node3 15621 1726882628.19715: Calling groups_inventory to load vars for managed_node3 15621 1726882628.19719: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882628.19732: Calling all_plugins_play to load vars for managed_node3 15621 1726882628.19734: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882628.19737: Calling groups_plugins_play to load vars for managed_node3 15621 1726882628.21262: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882628.23258: done with get_vars() 15621 1726882628.23292: done getting variables 15621 1726882628.23353: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15621 1726882628.23453: variable 'profile' from source: include params 15621 1726882628.23456: variable 'interface' from source: set_fact 15621 1726882628.23506: variable 'interface' from source: set_fact TASK [Get the fingerprint comment in ifcfg-lsr27] ****************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:37:08 -0400 (0:00:00.054) 0:01:00.314 ****** 15621 1726882628.23535: entering _queue_task() for managed_node3/command 15621 1726882628.23827: worker is 1 (out of 1 available) 15621 1726882628.23843: exiting _queue_task() for managed_node3/command 15621 1726882628.23855: done queuing things up, now waiting for results queue to drain 15621 1726882628.23857: waiting for pending results... 
15621 1726882628.24043: running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-lsr27 15621 1726882628.24140: in run() - task 0affc7ec-ae25-af1a-5b92-00000000050a 15621 1726882628.24153: variable 'ansible_search_path' from source: unknown 15621 1726882628.24156: variable 'ansible_search_path' from source: unknown 15621 1726882628.24195: calling self._execute() 15621 1726882628.24275: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882628.24279: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882628.24288: variable 'omit' from source: magic vars 15621 1726882628.24590: variable 'ansible_distribution_major_version' from source: facts 15621 1726882628.24600: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882628.24691: variable 'profile_stat' from source: set_fact 15621 1726882628.24705: Evaluated conditional (profile_stat.stat.exists): False 15621 1726882628.24708: when evaluation is False, skipping this task 15621 1726882628.24711: _execute() done 15621 1726882628.24715: dumping result to json 15621 1726882628.24717: done dumping result, returning 15621 1726882628.24742: done running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-lsr27 [0affc7ec-ae25-af1a-5b92-00000000050a] 15621 1726882628.24747: sending task result for task 0affc7ec-ae25-af1a-5b92-00000000050a 15621 1726882628.24826: done sending task result for task 0affc7ec-ae25-af1a-5b92-00000000050a 15621 1726882628.24830: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15621 1726882628.24899: no more pending results, returning what we have 15621 1726882628.24903: results queue empty 15621 1726882628.24904: checking for any_errors_fatal 15621 1726882628.24915: done checking for any_errors_fatal 15621 1726882628.24915: checking for max_fail_percentage 15621 1726882628.24917: done checking for max_fail_percentage 15621 1726882628.24917: checking to see if all hosts have failed and the running result is not ok 15621 1726882628.24919: done checking to see if all hosts have failed 15621 1726882628.24919: getting the remaining hosts for this loop 15621 1726882628.24921: done getting the remaining hosts for this loop 15621 1726882628.24927: getting the next task for host managed_node3 15621 1726882628.24934: done getting next task for host managed_node3 15621 1726882628.24937: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 15621 1726882628.24941: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882628.24946: getting variables 15621 1726882628.24947: in VariableManager get_vars() 15621 1726882628.24980: Calling all_inventory to load vars for managed_node3 15621 1726882628.24983: Calling groups_inventory to load vars for managed_node3 15621 1726882628.24987: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882628.24999: Calling all_plugins_play to load vars for managed_node3 15621 1726882628.25001: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882628.25004: Calling groups_plugins_play to load vars for managed_node3 15621 1726882628.26750: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882628.27915: done with get_vars() 15621 1726882628.27939: done getting variables 15621 1726882628.27993: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15621 1726882628.28089: variable 'profile' from source: include params 15621 1726882628.28093: variable 'interface' from source: set_fact 15621 1726882628.28138: variable 'interface' from source: set_fact TASK [Verify the fingerprint comment in ifcfg-lsr27] *************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:37:08 -0400 (0:00:00.046) 0:01:00.360 ****** 15621 1726882628.28164: entering _queue_task() for managed_node3/set_fact 15621 1726882628.28458: worker is 1 (out of 1 available) 15621 1726882628.28474: exiting _queue_task() for managed_node3/set_fact 15621 1726882628.28488: done queuing things up, now waiting for results queue to drain 15621 1726882628.28489: waiting for pending results... 
15621 1726882628.28683: running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-lsr27 15621 1726882628.28783: in run() - task 0affc7ec-ae25-af1a-5b92-00000000050b 15621 1726882628.28797: variable 'ansible_search_path' from source: unknown 15621 1726882628.28801: variable 'ansible_search_path' from source: unknown 15621 1726882628.28839: calling self._execute() 15621 1726882628.28914: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882628.28919: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882628.28930: variable 'omit' from source: magic vars 15621 1726882628.29232: variable 'ansible_distribution_major_version' from source: facts 15621 1726882628.29242: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882628.29336: variable 'profile_stat' from source: set_fact 15621 1726882628.29349: Evaluated conditional (profile_stat.stat.exists): False 15621 1726882628.29352: when evaluation is False, skipping this task 15621 1726882628.29355: _execute() done 15621 1726882628.29358: dumping result to json 15621 1726882628.29362: done dumping result, returning 15621 1726882628.29371: done running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-lsr27 [0affc7ec-ae25-af1a-5b92-00000000050b] 15621 1726882628.29382: sending task result for task 0affc7ec-ae25-af1a-5b92-00000000050b 15621 1726882628.29472: done sending task result for task 0affc7ec-ae25-af1a-5b92-00000000050b 15621 1726882628.29476: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15621 1726882628.29533: no more pending results, returning what we have 15621 1726882628.29537: results queue empty 15621 1726882628.29538: checking for any_errors_fatal 15621 1726882628.29543: done checking for any_errors_fatal 15621 1726882628.29544: checking for max_fail_percentage 15621 1726882628.29546: done checking for max_fail_percentage 15621 1726882628.29546: checking to see if all hosts have failed and the running result is not ok 15621 1726882628.29548: done checking to see if all hosts have failed 15621 1726882628.29548: getting the remaining hosts for this loop 15621 1726882628.29550: done getting the remaining hosts for this loop 15621 1726882628.29554: getting the next task for host managed_node3 15621 1726882628.29563: done getting next task for host managed_node3 15621 1726882628.29567: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 15621 1726882628.29571: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882628.29575: getting variables 15621 1726882628.29577: in VariableManager get_vars() 15621 1726882628.29610: Calling all_inventory to load vars for managed_node3 15621 1726882628.29612: Calling groups_inventory to load vars for managed_node3 15621 1726882628.29616: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882628.29631: Calling all_plugins_play to load vars for managed_node3 15621 1726882628.29634: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882628.29637: Calling groups_plugins_play to load vars for managed_node3 15621 1726882628.30640: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882628.31811: done with get_vars() 15621 1726882628.31840: done getting variables 15621 1726882628.31893: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15621 1726882628.31992: variable 'profile' from source: include params 15621 1726882628.31995: variable 'interface' from source: set_fact 15621 1726882628.32044: variable 'interface' from source: set_fact TASK [Assert that the profile is absent - 'lsr27'] ***************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Friday 20 September 2024 21:37:08 -0400 (0:00:00.039) 0:01:00.399 ****** 15621 1726882628.32070: entering _queue_task() for managed_node3/assert 15621 1726882628.32364: worker is 1 (out of 1 available) 15621 1726882628.32379: exiting _queue_task() for managed_node3/assert 15621 1726882628.32392: done queuing things up, now waiting for results queue to drain 15621 1726882628.32394: waiting for pending results... 
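[Editor's note] The task queued here comes from assert_profile_absent.yml:5, and the trace a few lines below shows it evaluating "not lsr_net_profile_exists" to True and returning "All assertions passed". A hedged sketch of such an assert task, with the failure message being an assumption:

    # Sketch based on the conditional visible in the trace below
    # ("Evaluated conditional (not lsr_net_profile_exists): True").
    - name: Assert that the profile is absent - '{{ profile }}'
      assert:
        that:
          - not lsr_net_profile_exists
        fail_msg: Profile {{ profile }} is still present   # hypothetical message
      when: ansible_distribution_major_version != '6'
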
15621 1726882628.32592: running TaskExecutor() for managed_node3/TASK: Assert that the profile is absent - 'lsr27' 15621 1726882628.32672: in run() - task 0affc7ec-ae25-af1a-5b92-0000000004f6 15621 1726882628.32688: variable 'ansible_search_path' from source: unknown 15621 1726882628.32692: variable 'ansible_search_path' from source: unknown 15621 1726882628.32726: calling self._execute() 15621 1726882628.32808: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882628.32814: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882628.32825: variable 'omit' from source: magic vars 15621 1726882628.33131: variable 'ansible_distribution_major_version' from source: facts 15621 1726882628.33142: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882628.33148: variable 'omit' from source: magic vars 15621 1726882628.33185: variable 'omit' from source: magic vars 15621 1726882628.33258: variable 'profile' from source: include params 15621 1726882628.33263: variable 'interface' from source: set_fact 15621 1726882628.33315: variable 'interface' from source: set_fact 15621 1726882628.33332: variable 'omit' from source: magic vars 15621 1726882628.33366: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882628.33403: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882628.33420: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882628.33437: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882628.33448: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882628.33474: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882628.33480: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882628.33482: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882628.33559: Set connection var ansible_connection to ssh 15621 1726882628.33567: Set connection var ansible_shell_executable to /bin/sh 15621 1726882628.33572: Set connection var ansible_timeout to 10 15621 1726882628.33575: Set connection var ansible_shell_type to sh 15621 1726882628.33582: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882628.33588: Set connection var ansible_pipelining to False 15621 1726882628.33611: variable 'ansible_shell_executable' from source: unknown 15621 1726882628.33614: variable 'ansible_connection' from source: unknown 15621 1726882628.33617: variable 'ansible_module_compression' from source: unknown 15621 1726882628.33619: variable 'ansible_shell_type' from source: unknown 15621 1726882628.33626: variable 'ansible_shell_executable' from source: unknown 15621 1726882628.33630: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882628.33632: variable 'ansible_pipelining' from source: unknown 15621 1726882628.33636: variable 'ansible_timeout' from source: unknown 15621 1726882628.33638: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882628.33753: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15621 1726882628.33763: variable 'omit' from source: magic vars 15621 1726882628.33768: starting attempt loop 15621 1726882628.33771: running the handler 15621 1726882628.33863: variable 'lsr_net_profile_exists' from source: set_fact 15621 1726882628.33867: Evaluated conditional (not lsr_net_profile_exists): True 15621 1726882628.33874: handler run complete 15621 1726882628.33891: attempt loop complete, returning result 15621 1726882628.33894: _execute() done 15621 1726882628.33897: dumping result to json 15621 1726882628.33900: done dumping result, returning 15621 1726882628.33907: done running TaskExecutor() for managed_node3/TASK: Assert that the profile is absent - 'lsr27' [0affc7ec-ae25-af1a-5b92-0000000004f6] 15621 1726882628.33912: sending task result for task 0affc7ec-ae25-af1a-5b92-0000000004f6 15621 1726882628.34003: done sending task result for task 0affc7ec-ae25-af1a-5b92-0000000004f6 15621 1726882628.34006: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 15621 1726882628.34097: no more pending results, returning what we have 15621 1726882628.34100: results queue empty 15621 1726882628.34101: checking for any_errors_fatal 15621 1726882628.34111: done checking for any_errors_fatal 15621 1726882628.34111: checking for max_fail_percentage 15621 1726882628.34113: done checking for max_fail_percentage 15621 1726882628.34116: checking to see if all hosts have failed and the running result is not ok 15621 1726882628.34117: done checking to see if all hosts have failed 15621 1726882628.34117: getting the remaining hosts for this loop 15621 1726882628.34119: done getting the remaining hosts for this loop 15621 1726882628.34123: getting the next task for host managed_node3 15621 1726882628.34133: done getting next task for host managed_node3 15621 1726882628.34136: ^ task is: TASK: Include the task 'assert_device_absent.yml' 15621 1726882628.34138: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882628.34142: getting variables 15621 1726882628.34144: in VariableManager get_vars() 15621 1726882628.34172: Calling all_inventory to load vars for managed_node3 15621 1726882628.34175: Calling groups_inventory to load vars for managed_node3 15621 1726882628.34178: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882628.34189: Calling all_plugins_play to load vars for managed_node3 15621 1726882628.34191: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882628.34194: Calling groups_plugins_play to load vars for managed_node3 15621 1726882628.35269: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882628.36430: done with get_vars() 15621 1726882628.36450: done getting variables TASK [Include the task 'assert_device_absent.yml'] ***************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:75 Friday 20 September 2024 21:37:08 -0400 (0:00:00.044) 0:01:00.444 ****** 15621 1726882628.36530: entering _queue_task() for managed_node3/include_tasks 15621 1726882628.36813: worker is 1 (out of 1 available) 15621 1726882628.36831: exiting _queue_task() for managed_node3/include_tasks 15621 1726882628.36847: done queuing things up, now waiting for results queue to drain 15621 1726882628.36849: waiting for pending results... 15621 1726882628.37045: running TaskExecutor() for managed_node3/TASK: Include the task 'assert_device_absent.yml' 15621 1726882628.37118: in run() - task 0affc7ec-ae25-af1a-5b92-000000000075 15621 1726882628.37134: variable 'ansible_search_path' from source: unknown 15621 1726882628.37167: calling self._execute() 15621 1726882628.37251: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882628.37257: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882628.37268: variable 'omit' from source: magic vars 15621 1726882628.37576: variable 'ansible_distribution_major_version' from source: facts 15621 1726882628.37589: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882628.37595: _execute() done 15621 1726882628.37599: dumping result to json 15621 1726882628.37604: done dumping result, returning 15621 1726882628.37610: done running TaskExecutor() for managed_node3/TASK: Include the task 'assert_device_absent.yml' [0affc7ec-ae25-af1a-5b92-000000000075] 15621 1726882628.37620: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000075 15621 1726882628.37714: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000075 15621 1726882628.37717: WORKER PROCESS EXITING 15621 1726882628.37752: no more pending results, returning what we have 15621 1726882628.37757: in VariableManager get_vars() 15621 1726882628.37794: Calling all_inventory to load vars for managed_node3 15621 1726882628.37797: Calling groups_inventory to load vars for managed_node3 15621 1726882628.37801: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882628.37817: Calling all_plugins_play to load vars for managed_node3 15621 1726882628.37819: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882628.37824: Calling groups_plugins_play to load vars for managed_node3 15621 1726882628.38927: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882628.40082: done with get_vars() 15621 
1726882628.40105: variable 'ansible_search_path' from source: unknown 15621 1726882628.40119: we have included files to process 15621 1726882628.40120: generating all_blocks data 15621 1726882628.40123: done generating all_blocks data 15621 1726882628.40127: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 15621 1726882628.40128: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 15621 1726882628.40130: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 15621 1726882628.40255: in VariableManager get_vars() 15621 1726882628.40267: done with get_vars() 15621 1726882628.40354: done processing included file 15621 1726882628.40356: iterating over new_blocks loaded from include file 15621 1726882628.40357: in VariableManager get_vars() 15621 1726882628.40365: done with get_vars() 15621 1726882628.40366: filtering new block on tags 15621 1726882628.40379: done filtering new block on tags 15621 1726882628.40381: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node3 15621 1726882628.40385: extending task lists for all hosts with included blocks 15621 1726882628.40482: done extending task lists 15621 1726882628.40483: done processing included files 15621 1726882628.40484: results queue empty 15621 1726882628.40484: checking for any_errors_fatal 15621 1726882628.40487: done checking for any_errors_fatal 15621 1726882628.40488: checking for max_fail_percentage 15621 1726882628.40489: done checking for max_fail_percentage 15621 1726882628.40489: checking to see if all hosts have failed and the running result is not ok 15621 1726882628.40490: done checking to see if all hosts have failed 15621 1726882628.40490: getting the remaining hosts for this loop 15621 1726882628.40491: done getting the remaining hosts for this loop 15621 1726882628.40493: getting the next task for host managed_node3 15621 1726882628.40495: done getting next task for host managed_node3 15621 1726882628.40497: ^ task is: TASK: Include the task 'get_interface_stat.yml' 15621 1726882628.40498: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882628.40500: getting variables 15621 1726882628.40500: in VariableManager get_vars() 15621 1726882628.40507: Calling all_inventory to load vars for managed_node3 15621 1726882628.40508: Calling groups_inventory to load vars for managed_node3 15621 1726882628.40510: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882628.40515: Calling all_plugins_play to load vars for managed_node3 15621 1726882628.40516: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882628.40518: Calling groups_plugins_play to load vars for managed_node3 15621 1726882628.41356: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882628.42493: done with get_vars() 15621 1726882628.42514: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Friday 20 September 2024 21:37:08 -0400 (0:00:00.060) 0:01:00.505 ****** 15621 1726882628.42579: entering _queue_task() for managed_node3/include_tasks 15621 1726882628.42872: worker is 1 (out of 1 available) 15621 1726882628.42889: exiting _queue_task() for managed_node3/include_tasks 15621 1726882628.42903: done queuing things up, now waiting for results queue to drain 15621 1726882628.42905: waiting for pending results... 15621 1726882628.43100: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 15621 1726882628.43193: in run() - task 0affc7ec-ae25-af1a-5b92-00000000053c 15621 1726882628.43205: variable 'ansible_search_path' from source: unknown 15621 1726882628.43209: variable 'ansible_search_path' from source: unknown 15621 1726882628.43244: calling self._execute() 15621 1726882628.43324: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882628.43330: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882628.43340: variable 'omit' from source: magic vars 15621 1726882628.43643: variable 'ansible_distribution_major_version' from source: facts 15621 1726882628.43653: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882628.43660: _execute() done 15621 1726882628.43664: dumping result to json 15621 1726882628.43669: done dumping result, returning 15621 1726882628.43675: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [0affc7ec-ae25-af1a-5b92-00000000053c] 15621 1726882628.43689: sending task result for task 0affc7ec-ae25-af1a-5b92-00000000053c 15621 1726882628.43772: done sending task result for task 0affc7ec-ae25-af1a-5b92-00000000053c 15621 1726882628.43776: WORKER PROCESS EXITING 15621 1726882628.43808: no more pending results, returning what we have 15621 1726882628.43813: in VariableManager get_vars() 15621 1726882628.43852: Calling all_inventory to load vars for managed_node3 15621 1726882628.43855: Calling groups_inventory to load vars for managed_node3 15621 1726882628.43859: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882628.43875: Calling all_plugins_play to load vars for managed_node3 15621 1726882628.43878: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882628.43881: Calling groups_plugins_play to load vars for managed_node3 15621 1726882628.44934: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 15621 1726882628.46101: done with get_vars() 15621 1726882628.46120: variable 'ansible_search_path' from source: unknown 15621 1726882628.46121: variable 'ansible_search_path' from source: unknown 15621 1726882628.46154: we have included files to process 15621 1726882628.46155: generating all_blocks data 15621 1726882628.46156: done generating all_blocks data 15621 1726882628.46157: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15621 1726882628.46158: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15621 1726882628.46159: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15621 1726882628.46309: done processing included file 15621 1726882628.46311: iterating over new_blocks loaded from include file 15621 1726882628.46313: in VariableManager get_vars() 15621 1726882628.46325: done with get_vars() 15621 1726882628.46326: filtering new block on tags 15621 1726882628.46337: done filtering new block on tags 15621 1726882628.46339: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 15621 1726882628.46343: extending task lists for all hosts with included blocks 15621 1726882628.46408: done extending task lists 15621 1726882628.46409: done processing included files 15621 1726882628.46410: results queue empty 15621 1726882628.46411: checking for any_errors_fatal 15621 1726882628.46413: done checking for any_errors_fatal 15621 1726882628.46414: checking for max_fail_percentage 15621 1726882628.46414: done checking for max_fail_percentage 15621 1726882628.46415: checking to see if all hosts have failed and the running result is not ok 15621 1726882628.46416: done checking to see if all hosts have failed 15621 1726882628.46416: getting the remaining hosts for this loop 15621 1726882628.46417: done getting the remaining hosts for this loop 15621 1726882628.46419: getting the next task for host managed_node3 15621 1726882628.46424: done getting next task for host managed_node3 15621 1726882628.46426: ^ task is: TASK: Get stat for interface {{ interface }} 15621 1726882628.46429: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882628.46431: getting variables 15621 1726882628.46432: in VariableManager get_vars() 15621 1726882628.46439: Calling all_inventory to load vars for managed_node3 15621 1726882628.46440: Calling groups_inventory to load vars for managed_node3 15621 1726882628.46442: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882628.46446: Calling all_plugins_play to load vars for managed_node3 15621 1726882628.46448: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882628.46450: Calling groups_plugins_play to load vars for managed_node3 15621 1726882628.47335: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882628.48484: done with get_vars() 15621 1726882628.48506: done getting variables 15621 1726882628.48638: variable 'interface' from source: set_fact TASK [Get stat for interface lsr27] ******************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:37:08 -0400 (0:00:00.060) 0:01:00.565 ****** 15621 1726882628.48663: entering _queue_task() for managed_node3/stat 15621 1726882628.48961: worker is 1 (out of 1 available) 15621 1726882628.48978: exiting _queue_task() for managed_node3/stat 15621 1726882628.48992: done queuing things up, now waiting for results queue to drain 15621 1726882628.48994: waiting for pending results... 15621 1726882628.49240: running TaskExecutor() for managed_node3/TASK: Get stat for interface lsr27 15621 1726882628.49294: in run() - task 0affc7ec-ae25-af1a-5b92-000000000554 15621 1726882628.49309: variable 'ansible_search_path' from source: unknown 15621 1726882628.49312: variable 'ansible_search_path' from source: unknown 15621 1726882628.49346: calling self._execute() 15621 1726882628.49420: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882628.49426: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882628.49438: variable 'omit' from source: magic vars 15621 1726882628.49735: variable 'ansible_distribution_major_version' from source: facts 15621 1726882628.49746: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882628.49752: variable 'omit' from source: magic vars 15621 1726882628.49788: variable 'omit' from source: magic vars 15621 1726882628.49864: variable 'interface' from source: set_fact 15621 1726882628.49880: variable 'omit' from source: magic vars 15621 1726882628.49927: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882628.49958: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882628.49979: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882628.49992: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882628.50003: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882628.50033: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882628.50037: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882628.50040: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 15621 1726882628.50114: Set connection var ansible_connection to ssh 15621 1726882628.50123: Set connection var ansible_shell_executable to /bin/sh 15621 1726882628.50128: Set connection var ansible_timeout to 10 15621 1726882628.50138: Set connection var ansible_shell_type to sh 15621 1726882628.50143: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882628.50149: Set connection var ansible_pipelining to False 15621 1726882628.50170: variable 'ansible_shell_executable' from source: unknown 15621 1726882628.50173: variable 'ansible_connection' from source: unknown 15621 1726882628.50175: variable 'ansible_module_compression' from source: unknown 15621 1726882628.50180: variable 'ansible_shell_type' from source: unknown 15621 1726882628.50183: variable 'ansible_shell_executable' from source: unknown 15621 1726882628.50185: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882628.50191: variable 'ansible_pipelining' from source: unknown 15621 1726882628.50194: variable 'ansible_timeout' from source: unknown 15621 1726882628.50197: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882628.50373: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15621 1726882628.50385: variable 'omit' from source: magic vars 15621 1726882628.50390: starting attempt loop 15621 1726882628.50393: running the handler 15621 1726882628.50405: _low_level_execute_command(): starting 15621 1726882628.50412: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15621 1726882628.50977: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882628.50981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882628.50986: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882628.51048: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882628.51051: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882628.51056: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882628.51149: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882628.52922: stdout chunk (state=3): >>>/root <<< 15621 1726882628.53026: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882628.53087: stderr chunk (state=3): >>><<< 15621 1726882628.53091: stdout chunk 
(state=3): >>><<< 15621 1726882628.53113: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882628.53129: _low_level_execute_command(): starting 15621 1726882628.53135: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882628.5311399-17806-80085932424541 `" && echo ansible-tmp-1726882628.5311399-17806-80085932424541="` echo /root/.ansible/tmp/ansible-tmp-1726882628.5311399-17806-80085932424541 `" ) && sleep 0' 15621 1726882628.53622: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882628.53626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found <<< 15621 1726882628.53630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882628.53662: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882628.53666: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882628.53682: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882628.53699: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882628.53788: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882628.55759: stdout chunk (state=3): >>>ansible-tmp-1726882628.5311399-17806-80085932424541=/root/.ansible/tmp/ansible-tmp-1726882628.5311399-17806-80085932424541 <<< 15621 1726882628.55874: stderr chunk (state=3): >>>debug2: Received exit status from master 0 
<<< 15621 1726882628.55934: stderr chunk (state=3): >>><<< 15621 1726882628.55938: stdout chunk (state=3): >>><<< 15621 1726882628.55954: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882628.5311399-17806-80085932424541=/root/.ansible/tmp/ansible-tmp-1726882628.5311399-17806-80085932424541 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882628.55998: variable 'ansible_module_compression' from source: unknown 15621 1726882628.56048: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15621x2kdpbzd/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 15621 1726882628.56086: variable 'ansible_facts' from source: unknown 15621 1726882628.56144: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882628.5311399-17806-80085932424541/AnsiballZ_stat.py 15621 1726882628.56261: Sending initial data 15621 1726882628.56265: Sent initial data (152 bytes) 15621 1726882628.56749: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882628.56753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 <<< 15621 1726882628.56755: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882628.56758: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882628.56760: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882628.56817: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882628.56828: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882628.56907: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 15621 1726882628.58493: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 15621 1726882628.58497: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15621 1726882628.58575: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15621 1726882628.58664: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmpr83f_fop /root/.ansible/tmp/ansible-tmp-1726882628.5311399-17806-80085932424541/AnsiballZ_stat.py <<< 15621 1726882628.58667: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882628.5311399-17806-80085932424541/AnsiballZ_stat.py" <<< 15621 1726882628.58743: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmpr83f_fop" to remote "/root/.ansible/tmp/ansible-tmp-1726882628.5311399-17806-80085932424541/AnsiballZ_stat.py" <<< 15621 1726882628.58751: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882628.5311399-17806-80085932424541/AnsiballZ_stat.py" <<< 15621 1726882628.59457: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882628.59534: stderr chunk (state=3): >>><<< 15621 1726882628.59537: stdout chunk (state=3): >>><<< 15621 1726882628.59557: done transferring module to remote 15621 1726882628.59569: _low_level_execute_command(): starting 15621 1726882628.59576: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882628.5311399-17806-80085932424541/ /root/.ansible/tmp/ansible-tmp-1726882628.5311399-17806-80085932424541/AnsiballZ_stat.py && sleep 0' 15621 1726882628.60073: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882628.60077: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882628.60080: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882628.60082: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882628.60089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 
debug2: match found <<< 15621 1726882628.60092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882628.60143: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882628.60148: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882628.60150: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882628.60233: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882628.62033: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882628.62090: stderr chunk (state=3): >>><<< 15621 1726882628.62093: stdout chunk (state=3): >>><<< 15621 1726882628.62108: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882628.62111: _low_level_execute_command(): starting 15621 1726882628.62117: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882628.5311399-17806-80085932424541/AnsiballZ_stat.py && sleep 0' 15621 1726882628.62605: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882628.62608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882628.62611: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882628.62614: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882628.62667: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/9aa64530f0' <<< 15621 1726882628.62675: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882628.62677: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882628.62761: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882628.79178: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr27", "follow": false, "checksum_algorithm": "sha1"}}} <<< 15621 1726882628.80371: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. <<< 15621 1726882628.80437: stderr chunk (state=3): >>><<< 15621 1726882628.80441: stdout chunk (state=3): >>><<< 15621 1726882628.80457: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr27", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 
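[Editor's note] The module invocation echoed in the stdout above shows exactly which arguments the stat action sent to the target: path /sys/class/net/lsr27 with attribute, checksum, and MIME collection disabled. The task at get_interface_stat.yml:3 therefore looks roughly like the sketch below; the register name is an assumption, as it is not visible in this part of the trace.

    # Reconstructed from the module_args echoed in the result above;
    # the register name `interface_stat` is an assumption.
    - name: Get stat for interface {{ interface }}
      stat:
        path: /sys/class/net/{{ interface }}
        get_attributes: false
        get_checksum: false
        get_mime: false
      register: interface_stat
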
15621 1726882628.80484: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/lsr27', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882628.5311399-17806-80085932424541/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15621 1726882628.80495: _low_level_execute_command(): starting 15621 1726882628.80501: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882628.5311399-17806-80085932424541/ > /dev/null 2>&1 && sleep 0' 15621 1726882628.81007: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882628.81011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882628.81013: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882628.81015: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882628.81024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found <<< 15621 1726882628.81026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882628.81078: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882628.81082: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882628.81084: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882628.81172: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882628.83096: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882628.83102: stderr chunk (state=3): >>><<< 15621 1726882628.83105: stdout chunk (state=3): >>><<< 15621 1726882628.83124: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882628.83131: handler run complete 15621 1726882628.83149: attempt loop complete, returning result 15621 1726882628.83152: _execute() done 15621 1726882628.83154: dumping result to json 15621 1726882628.83159: done dumping result, returning 15621 1726882628.83167: done running TaskExecutor() for managed_node3/TASK: Get stat for interface lsr27 [0affc7ec-ae25-af1a-5b92-000000000554] 15621 1726882628.83172: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000554 15621 1726882628.83277: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000554 15621 1726882628.83280: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 15621 1726882628.83367: no more pending results, returning what we have 15621 1726882628.83372: results queue empty 15621 1726882628.83373: checking for any_errors_fatal 15621 1726882628.83375: done checking for any_errors_fatal 15621 1726882628.83376: checking for max_fail_percentage 15621 1726882628.83377: done checking for max_fail_percentage 15621 1726882628.83378: checking to see if all hosts have failed and the running result is not ok 15621 1726882628.83379: done checking to see if all hosts have failed 15621 1726882628.83380: getting the remaining hosts for this loop 15621 1726882628.83381: done getting the remaining hosts for this loop 15621 1726882628.83386: getting the next task for host managed_node3 15621 1726882628.83396: done getting next task for host managed_node3 15621 1726882628.83401: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 15621 1726882628.83404: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882628.83408: getting variables 15621 1726882628.83410: in VariableManager get_vars() 15621 1726882628.83442: Calling all_inventory to load vars for managed_node3 15621 1726882628.83446: Calling groups_inventory to load vars for managed_node3 15621 1726882628.83449: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882628.83462: Calling all_plugins_play to load vars for managed_node3 15621 1726882628.83465: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882628.83468: Calling groups_plugins_play to load vars for managed_node3 15621 1726882628.84478: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882628.85645: done with get_vars() 15621 1726882628.85668: done getting variables 15621 1726882628.85726: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15621 1726882628.85821: variable 'interface' from source: set_fact TASK [Assert that the interface is absent - 'lsr27'] *************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Friday 20 September 2024 21:37:08 -0400 (0:00:00.371) 0:01:00.937 ****** 15621 1726882628.85848: entering _queue_task() for managed_node3/assert 15621 1726882628.86139: worker is 1 (out of 1 available) 15621 1726882628.86154: exiting _queue_task() for managed_node3/assert 15621 1726882628.86169: done queuing things up, now waiting for results queue to drain 15621 1726882628.86171: waiting for pending results... 
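The task header above points at assert_device_absent.yml:5. A minimal sketch of what that assertion likely looks like, based on the templated task name and the conditional (not interface_stat.stat.exists) evaluated in the handler below; the fail_msg wording is an assumption:

    # Hypothetical sketch of the assertion in assert_device_absent.yml
    - name: Assert that the interface is absent - '{{ interface }}'
      ansible.builtin.assert:
        that:
          - not interface_stat.stat.exists
        fail_msg: "Interface {{ interface }} is still present"   # assumed wording, not from the log
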
15621 1726882628.86358: running TaskExecutor() for managed_node3/TASK: Assert that the interface is absent - 'lsr27' 15621 1726882628.86442: in run() - task 0affc7ec-ae25-af1a-5b92-00000000053d 15621 1726882628.86453: variable 'ansible_search_path' from source: unknown 15621 1726882628.86457: variable 'ansible_search_path' from source: unknown 15621 1726882628.86492: calling self._execute() 15621 1726882628.86566: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882628.86570: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882628.86581: variable 'omit' from source: magic vars 15621 1726882628.86885: variable 'ansible_distribution_major_version' from source: facts 15621 1726882628.86896: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882628.86902: variable 'omit' from source: magic vars 15621 1726882628.86936: variable 'omit' from source: magic vars 15621 1726882628.87013: variable 'interface' from source: set_fact 15621 1726882628.87029: variable 'omit' from source: magic vars 15621 1726882628.87070: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882628.87099: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882628.87125: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882628.87140: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882628.87151: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882628.87181: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882628.87184: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882628.87188: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882628.87264: Set connection var ansible_connection to ssh 15621 1726882628.87272: Set connection var ansible_shell_executable to /bin/sh 15621 1726882628.87284: Set connection var ansible_timeout to 10 15621 1726882628.87288: Set connection var ansible_shell_type to sh 15621 1726882628.87291: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882628.87293: Set connection var ansible_pipelining to False 15621 1726882628.87312: variable 'ansible_shell_executable' from source: unknown 15621 1726882628.87315: variable 'ansible_connection' from source: unknown 15621 1726882628.87318: variable 'ansible_module_compression' from source: unknown 15621 1726882628.87321: variable 'ansible_shell_type' from source: unknown 15621 1726882628.87325: variable 'ansible_shell_executable' from source: unknown 15621 1726882628.87328: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882628.87335: variable 'ansible_pipelining' from source: unknown 15621 1726882628.87338: variable 'ansible_timeout' from source: unknown 15621 1726882628.87340: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882628.87453: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 15621 1726882628.87462: variable 'omit' from source: magic vars 15621 1726882628.87467: starting attempt loop 15621 1726882628.87470: running the handler 15621 1726882628.87580: variable 'interface_stat' from source: set_fact 15621 1726882628.87588: Evaluated conditional (not interface_stat.stat.exists): True 15621 1726882628.87595: handler run complete 15621 1726882628.87613: attempt loop complete, returning result 15621 1726882628.87617: _execute() done 15621 1726882628.87619: dumping result to json 15621 1726882628.87623: done dumping result, returning 15621 1726882628.87626: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is absent - 'lsr27' [0affc7ec-ae25-af1a-5b92-00000000053d] 15621 1726882628.87632: sending task result for task 0affc7ec-ae25-af1a-5b92-00000000053d 15621 1726882628.87723: done sending task result for task 0affc7ec-ae25-af1a-5b92-00000000053d 15621 1726882628.87727: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 15621 1726882628.87780: no more pending results, returning what we have 15621 1726882628.87784: results queue empty 15621 1726882628.87785: checking for any_errors_fatal 15621 1726882628.87793: done checking for any_errors_fatal 15621 1726882628.87794: checking for max_fail_percentage 15621 1726882628.87795: done checking for max_fail_percentage 15621 1726882628.87796: checking to see if all hosts have failed and the running result is not ok 15621 1726882628.87798: done checking to see if all hosts have failed 15621 1726882628.87798: getting the remaining hosts for this loop 15621 1726882628.87800: done getting the remaining hosts for this loop 15621 1726882628.87804: getting the next task for host managed_node3 15621 1726882628.87814: done getting next task for host managed_node3 15621 1726882628.87816: ^ task is: TASK: meta (flush_handlers) 15621 1726882628.87818: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882628.87825: getting variables 15621 1726882628.87826: in VariableManager get_vars() 15621 1726882628.87857: Calling all_inventory to load vars for managed_node3 15621 1726882628.87860: Calling groups_inventory to load vars for managed_node3 15621 1726882628.87863: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882628.87878: Calling all_plugins_play to load vars for managed_node3 15621 1726882628.87881: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882628.87884: Calling groups_plugins_play to load vars for managed_node3 15621 1726882628.89182: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882628.90337: done with get_vars() 15621 1726882628.90357: done getting variables 15621 1726882628.90444: in VariableManager get_vars() 15621 1726882628.90455: Calling all_inventory to load vars for managed_node3 15621 1726882628.90457: Calling groups_inventory to load vars for managed_node3 15621 1726882628.90459: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882628.90465: Calling all_plugins_play to load vars for managed_node3 15621 1726882628.90467: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882628.90470: Calling groups_plugins_play to load vars for managed_node3 15621 1726882628.92051: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882628.94207: done with get_vars() 15621 1726882628.94252: done queuing things up, now waiting for results queue to drain 15621 1726882628.94254: results queue empty 15621 1726882628.94255: checking for any_errors_fatal 15621 1726882628.94258: done checking for any_errors_fatal 15621 1726882628.94259: checking for max_fail_percentage 15621 1726882628.94261: done checking for max_fail_percentage 15621 1726882628.94261: checking to see if all hosts have failed and the running result is not ok 15621 1726882628.94262: done checking to see if all hosts have failed 15621 1726882628.94269: getting the remaining hosts for this loop 15621 1726882628.94270: done getting the remaining hosts for this loop 15621 1726882628.94273: getting the next task for host managed_node3 15621 1726882628.94280: done getting next task for host managed_node3 15621 1726882628.94281: ^ task is: TASK: meta (flush_handlers) 15621 1726882628.94283: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882628.94286: getting variables 15621 1726882628.94287: in VariableManager get_vars() 15621 1726882628.94298: Calling all_inventory to load vars for managed_node3 15621 1726882628.94301: Calling groups_inventory to load vars for managed_node3 15621 1726882628.94303: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882628.94309: Calling all_plugins_play to load vars for managed_node3 15621 1726882628.94312: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882628.94315: Calling groups_plugins_play to load vars for managed_node3 15621 1726882628.95392: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882628.96533: done with get_vars() 15621 1726882628.96552: done getting variables 15621 1726882628.96594: in VariableManager get_vars() 15621 1726882628.96604: Calling all_inventory to load vars for managed_node3 15621 1726882628.96606: Calling groups_inventory to load vars for managed_node3 15621 1726882628.96608: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882628.96611: Calling all_plugins_play to load vars for managed_node3 15621 1726882628.96613: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882628.96615: Calling groups_plugins_play to load vars for managed_node3 15621 1726882629.02905: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882629.05183: done with get_vars() 15621 1726882629.05225: done queuing things up, now waiting for results queue to drain 15621 1726882629.05228: results queue empty 15621 1726882629.05229: checking for any_errors_fatal 15621 1726882629.05231: done checking for any_errors_fatal 15621 1726882629.05232: checking for max_fail_percentage 15621 1726882629.05233: done checking for max_fail_percentage 15621 1726882629.05234: checking to see if all hosts have failed and the running result is not ok 15621 1726882629.05234: done checking to see if all hosts have failed 15621 1726882629.05235: getting the remaining hosts for this loop 15621 1726882629.05236: done getting the remaining hosts for this loop 15621 1726882629.05240: getting the next task for host managed_node3 15621 1726882629.05244: done getting next task for host managed_node3 15621 1726882629.05245: ^ task is: None 15621 1726882629.05247: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882629.05248: done queuing things up, now waiting for results queue to drain 15621 1726882629.05249: results queue empty 15621 1726882629.05250: checking for any_errors_fatal 15621 1726882629.05251: done checking for any_errors_fatal 15621 1726882629.05252: checking for max_fail_percentage 15621 1726882629.05253: done checking for max_fail_percentage 15621 1726882629.05254: checking to see if all hosts have failed and the running result is not ok 15621 1726882629.05254: done checking to see if all hosts have failed 15621 1726882629.05256: getting the next task for host managed_node3 15621 1726882629.05258: done getting next task for host managed_node3 15621 1726882629.05259: ^ task is: None 15621 1726882629.05261: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15621 1726882629.05304: in VariableManager get_vars() 15621 1726882629.05325: done with get_vars() 15621 1726882629.05332: in VariableManager get_vars() 15621 1726882629.05342: done with get_vars() 15621 1726882629.05346: variable 'omit' from source: magic vars 15621 1726882629.05379: in VariableManager get_vars() 15621 1726882629.05388: done with get_vars() 15621 1726882629.05412: variable 'omit' from source: magic vars PLAY [Verify that cleanup restored state to default] *************************** 15621 1726882629.05777: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15621 1726882629.06129: getting the remaining hosts for this loop 15621 1726882629.06130: done getting the remaining hosts for this loop 15621 1726882629.06133: getting the next task for host managed_node3 15621 1726882629.06136: done getting next task for host managed_node3 15621 1726882629.06138: ^ task is: TASK: Gathering Facts 15621 1726882629.06139: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882629.06141: getting variables 15621 1726882629.06142: in VariableManager get_vars() 15621 1726882629.06151: Calling all_inventory to load vars for managed_node3 15621 1726882629.06153: Calling groups_inventory to load vars for managed_node3 15621 1726882629.06156: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882629.06162: Calling all_plugins_play to load vars for managed_node3 15621 1726882629.06165: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882629.06167: Calling groups_plugins_play to load vars for managed_node3 15621 1726882629.09632: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882629.11931: done with get_vars() 15621 1726882629.11963: done getting variables 15621 1726882629.12016: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:77 Friday 20 September 2024 21:37:09 -0400 (0:00:00.261) 0:01:01.199 ****** 15621 1726882629.12046: entering _queue_task() for managed_node3/gather_facts 15621 1726882629.12425: worker is 1 (out of 1 available) 15621 1726882629.12440: exiting _queue_task() for managed_node3/gather_facts 15621 1726882629.12454: done queuing things up, now waiting for results queue to drain 15621 1726882629.12455: waiting for pending results... 
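The implicit fact gathering queued above (tests_ethernet.yml:77, play "Verify that cleanup restored state to default") is equivalent to an explicit setup task. A minimal sketch using the module_args reported in the fact-gathering result further down (gather_subset, gather_timeout); the hosts line and play layout are assumptions:

    # Hypothetical explicit form of the 'Gathering Facts' step for this play
    - name: Verify that cleanup restored state to default
      hosts: managed_node3          # assumed; this run targets managed_node3
      gather_facts: false
      tasks:
        - name: Gathering Facts
          ansible.builtin.setup:
            gather_subset:
              - all
            gather_timeout: 10
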
15621 1726882629.12700: running TaskExecutor() for managed_node3/TASK: Gathering Facts 15621 1726882629.12815: in run() - task 0affc7ec-ae25-af1a-5b92-00000000056d 15621 1726882629.12846: variable 'ansible_search_path' from source: unknown 15621 1726882629.12941: calling self._execute() 15621 1726882629.13000: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882629.13014: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882629.13031: variable 'omit' from source: magic vars 15621 1726882629.13456: variable 'ansible_distribution_major_version' from source: facts 15621 1726882629.13475: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882629.13491: variable 'omit' from source: magic vars 15621 1726882629.13593: variable 'omit' from source: magic vars 15621 1726882629.13597: variable 'omit' from source: magic vars 15621 1726882629.13620: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882629.13669: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882629.13702: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882629.13729: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882629.13747: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882629.13786: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15621 1726882629.13795: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882629.13807: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882629.13922: Set connection var ansible_connection to ssh 15621 1726882629.14030: Set connection var ansible_shell_executable to /bin/sh 15621 1726882629.14033: Set connection var ansible_timeout to 10 15621 1726882629.14036: Set connection var ansible_shell_type to sh 15621 1726882629.14038: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882629.14040: Set connection var ansible_pipelining to False 15621 1726882629.14041: variable 'ansible_shell_executable' from source: unknown 15621 1726882629.14043: variable 'ansible_connection' from source: unknown 15621 1726882629.14045: variable 'ansible_module_compression' from source: unknown 15621 1726882629.14047: variable 'ansible_shell_type' from source: unknown 15621 1726882629.14049: variable 'ansible_shell_executable' from source: unknown 15621 1726882629.14053: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882629.14055: variable 'ansible_pipelining' from source: unknown 15621 1726882629.14059: variable 'ansible_timeout' from source: unknown 15621 1726882629.14062: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882629.14218: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15621 1726882629.14315: variable 'omit' from source: magic vars 15621 1726882629.14336: starting attempt loop 15621 1726882629.14349: running the 
handler 15621 1726882629.14382: variable 'ansible_facts' from source: unknown 15621 1726882629.14419: _low_level_execute_command(): starting 15621 1726882629.14440: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15621 1726882629.15282: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882629.15339: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882629.15345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882629.15373: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882629.15438: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882629.15468: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882629.15488: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882629.15539: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882629.15624: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882629.17417: stdout chunk (state=3): >>>/root <<< 15621 1726882629.17838: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882629.17841: stdout chunk (state=3): >>><<< 15621 1726882629.17843: stderr chunk (state=3): >>><<< 15621 1726882629.18054: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882629.18059: _low_level_execute_command(): starting 15621 1726882629.18062: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` 
echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882629.1786387-17824-254519997196986 `" && echo ansible-tmp-1726882629.1786387-17824-254519997196986="` echo /root/.ansible/tmp/ansible-tmp-1726882629.1786387-17824-254519997196986 `" ) && sleep 0' 15621 1726882629.18990: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882629.19041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882629.19066: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882629.19100: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882629.19179: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882629.21529: stdout chunk (state=3): >>>ansible-tmp-1726882629.1786387-17824-254519997196986=/root/.ansible/tmp/ansible-tmp-1726882629.1786387-17824-254519997196986 <<< 15621 1726882629.21533: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882629.21535: stdout chunk (state=3): >>><<< 15621 1726882629.21537: stderr chunk (state=3): >>><<< 15621 1726882629.21540: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882629.1786387-17824-254519997196986=/root/.ansible/tmp/ansible-tmp-1726882629.1786387-17824-254519997196986 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882629.21542: variable 'ansible_module_compression' from source: unknown 15621 1726882629.21595: 
ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15621x2kdpbzd/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15621 1726882629.21775: variable 'ansible_facts' from source: unknown 15621 1726882629.22097: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882629.1786387-17824-254519997196986/AnsiballZ_setup.py 15621 1726882629.22588: Sending initial data 15621 1726882629.22592: Sent initial data (154 bytes) 15621 1726882629.24221: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882629.24485: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882629.24488: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882629.24492: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882629.24494: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882629.24614: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882629.26217: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15621 1726882629.26321: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15621 1726882629.26425: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmpvh4mzxd1 /root/.ansible/tmp/ansible-tmp-1726882629.1786387-17824-254519997196986/AnsiballZ_setup.py <<< 15621 1726882629.26441: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882629.1786387-17824-254519997196986/AnsiballZ_setup.py" <<< 15621 1726882629.26585: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmpvh4mzxd1" to remote "/root/.ansible/tmp/ansible-tmp-1726882629.1786387-17824-254519997196986/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882629.1786387-17824-254519997196986/AnsiballZ_setup.py" <<< 15621 1726882629.29976: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882629.29981: stderr chunk (state=3): >>><<< 15621 1726882629.29984: stdout chunk (state=3): >>><<< 15621 1726882629.29987: done transferring module to remote 15621 1726882629.29989: _low_level_execute_command(): starting 15621 1726882629.29992: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882629.1786387-17824-254519997196986/ /root/.ansible/tmp/ansible-tmp-1726882629.1786387-17824-254519997196986/AnsiballZ_setup.py && sleep 0' 15621 1726882629.31241: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882629.31425: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882629.31668: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882629.31757: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882629.33772: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882629.33905: stderr chunk (state=3): >>><<< 15621 1726882629.33908: stdout chunk (state=3): >>><<< 15621 1726882629.33917: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882629.33926: _low_level_execute_command(): starting 15621 1726882629.33929: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882629.1786387-17824-254519997196986/AnsiballZ_setup.py && sleep 0' 15621 1726882629.35006: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882629.35013: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882629.35016: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882629.35018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882629.35370: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882629.35467: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882631.50591: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCqxUjCEqNblrTyW6Uf6mIYxca8N+p8oJNuOoXU65bGRNg3CMa5WjaOdqcLJoa5cqHU94Eb2GKTTyez0hcVUk7tsi3NxQudVrBDJQwbGPLKwHTfAOeffQrSKU6cQIc1wl+jLeNyQet7t+mRPHDLLjdsLuWud7KDSFY7tB05hqCIT7br7Ql/dFhcnCdWQFQMOHFOz3ScJe9gey/LD3ji7GRONjSr/t5cpKmB6mxzEmsb1n6YZdbP8HCphGcvKR4W+uaX3gVfQE0qvrqlobTyex8yIrkML2bRGO0cQ0YQWRYUwl+2NZufO8pixR1WlzvjooEQLCa77cJ6SZ8LyFkDOI+mMyuj8kcM9hS4AD91rPxl8C0d6Jg8RKqnImxC3X/NNaRYHqlewUGo6VKkcO4+lxgJGqYFcmkGEHzq4fuf6gtrr3rJkcIFcrluI0mSyZ2wXzI9K1OLHK0fnDvDUdV21RdTxfpz2ZFqykIWxdtugE4qaNMgbtV0VnufdkfZoCt9ayU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCVkskQ7Wf194qJaR5aLJzIbxDeKLsVL0wQFKV8r0F7GGZAGvI7/LHajoQ1NvRR35h4P+UpQQWPriVBtLfXYfXQ=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", 
"ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHICQbhAvKstSrwCX3R+nlPjOjLF0EHt/gL32n1ZS9Xl", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_apparmor": {"status": "disabled"}, "ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "ip-10-31-45-226.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-226", "ansible_nodename": "ip-10-31-45-226.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ea14692d25e88f0b7167787b368d", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_local": {}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:19:da:ea:a3:f3", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.226", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::19:daff:feea:a3f3", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.226", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:19:da:ea:a3:f3", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.226"], "ansible_all_ipv6_addresses": ["fe80::19:daff:feea:a3f3"], 
"ansible_locally_reachable_ips": {"ipv4": ["10.31.45.226", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::19:daff:feea:a3f3"]}, "ansible_is_chroot": false, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.180 36814 10.31.45.226 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 36814 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3106, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 610, "free": 3106}, "nocache": {"free": 3489, "used": 227}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ea14-692d-25e8-8f0b-7167787b368d", "ansible_product_uuid": "ec22ea14-692d-25e8-8f0b-7167787b368d", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"]}, "labels": {}, 
"masters": {}}, "ansible_uptime_seconds": 775, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251384107008, "block_size": 4096, "block_total": 64483404, "block_available": 61373073, "block_used": 3110331, "inode_total": 16384000, "inode_available": 16303144, "inode_used": 80856, "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"}], "ansible_lsb": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "37", "second": "11", "epoch": "1726882631", "epoch_int": "1726882631", "date": "2024-09-20", "time": "21:37:11", "iso8601_micro": "2024-09-21T01:37:11.501485Z", "iso8601": "2024-09-21T01:37:11Z", "iso8601_basic": "20240920T213711501485", "iso8601_basic_short": "20240920T213711", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fips": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:b5954bb9-e972-4b2a-94f1-a82c77e96f77", "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "ansible_iscsi_iqn": "", "ansible_fibre_channel_wwn": [], "ansible_loadavg": {"1m": 0.46923828125, "5m": 0.5927734375, "15m": 0.3203125}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15621 1726882631.52564: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 
<<< 15621 1726882631.52618: stderr chunk (state=3): >>><<< 15621 1726882631.52624: stdout chunk (state=3): >>><<< 15621 1726882631.52649: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCqxUjCEqNblrTyW6Uf6mIYxca8N+p8oJNuOoXU65bGRNg3CMa5WjaOdqcLJoa5cqHU94Eb2GKTTyez0hcVUk7tsi3NxQudVrBDJQwbGPLKwHTfAOeffQrSKU6cQIc1wl+jLeNyQet7t+mRPHDLLjdsLuWud7KDSFY7tB05hqCIT7br7Ql/dFhcnCdWQFQMOHFOz3ScJe9gey/LD3ji7GRONjSr/t5cpKmB6mxzEmsb1n6YZdbP8HCphGcvKR4W+uaX3gVfQE0qvrqlobTyex8yIrkML2bRGO0cQ0YQWRYUwl+2NZufO8pixR1WlzvjooEQLCa77cJ6SZ8LyFkDOI+mMyuj8kcM9hS4AD91rPxl8C0d6Jg8RKqnImxC3X/NNaRYHqlewUGo6VKkcO4+lxgJGqYFcmkGEHzq4fuf6gtrr3rJkcIFcrluI0mSyZ2wXzI9K1OLHK0fnDvDUdV21RdTxfpz2ZFqykIWxdtugE4qaNMgbtV0VnufdkfZoCt9ayU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCVkskQ7Wf194qJaR5aLJzIbxDeKLsVL0wQFKV8r0F7GGZAGvI7/LHajoQ1NvRR35h4P+UpQQWPriVBtLfXYfXQ=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHICQbhAvKstSrwCX3R+nlPjOjLF0EHt/gL32n1ZS9Xl", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_apparmor": {"status": "disabled"}, "ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "ip-10-31-45-226.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-226", "ansible_nodename": "ip-10-31-45-226.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ea14692d25e88f0b7167787b368d", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, 
"type": "cpython"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_local": {}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:19:da:ea:a3:f3", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.226", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::19:daff:feea:a3f3", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.226", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:19:da:ea:a3:f3", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.226"], "ansible_all_ipv6_addresses": ["fe80::19:daff:feea:a3f3"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.226", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::19:daff:feea:a3f3"]}, "ansible_is_chroot": false, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.180 36814 10.31.45.226 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 36814 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3106, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 610, "free": 3106}, "nocache": {"free": 3489, "used": 227}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ea14-692d-25e8-8f0b-7167787b368d", "ansible_product_uuid": "ec22ea14-692d-25e8-8f0b-7167787b368d", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": 
[], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 775, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251384107008, "block_size": 4096, "block_total": 64483404, "block_available": 61373073, "block_used": 3110331, "inode_total": 16384000, "inode_available": 16303144, "inode_used": 80856, "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"}], "ansible_lsb": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "37", "second": "11", "epoch": "1726882631", "epoch_int": "1726882631", "date": "2024-09-20", "time": "21:37:11", "iso8601_micro": "2024-09-21T01:37:11.501485Z", "iso8601": "2024-09-21T01:37:11Z", "iso8601_basic": "20240920T213711501485", "iso8601_basic_short": "20240920T213711", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fips": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:b5954bb9-e972-4b2a-94f1-a82c77e96f77", "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "ansible_iscsi_iqn": "", "ansible_fibre_channel_wwn": [], "ansible_loadavg": {"1m": 0.46923828125, "5m": 0.5927734375, "15m": 0.3203125}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 15621 1726882631.52955: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882629.1786387-17824-254519997196986/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15621 1726882631.52958: _low_level_execute_command(): starting 15621 1726882631.52961: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882629.1786387-17824-254519997196986/ > /dev/null 2>&1 && sleep 0' 15621 1726882631.53595: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882631.53599: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882631.53606: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882631.53705: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882631.55598: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882631.55644: stderr chunk (state=3): >>><<< 15621 1726882631.55651: stdout chunk (state=3): >>><<< 15621 1726882631.55729: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882631.55733: handler run complete 15621 1726882631.55766: variable 'ansible_facts' from source: unknown 15621 1726882631.55839: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882631.56036: variable 'ansible_facts' from source: unknown 15621 1726882631.56097: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882631.56177: attempt loop complete, returning result 15621 1726882631.56181: _execute() done 15621 1726882631.56184: dumping result to json 15621 1726882631.56201: done dumping result, returning 15621 1726882631.56208: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [0affc7ec-ae25-af1a-5b92-00000000056d] 15621 1726882631.56214: sending task result for task 0affc7ec-ae25-af1a-5b92-00000000056d 15621 1726882631.56472: done sending task result for task 0affc7ec-ae25-af1a-5b92-00000000056d 15621 1726882631.56475: WORKER PROCESS EXITING ok: [managed_node3] 15621 1726882631.56706: no more pending results, returning what we have 15621 1726882631.56708: results queue empty 15621 1726882631.56709: checking for any_errors_fatal 15621 1726882631.56710: done checking for any_errors_fatal 15621 1726882631.56710: checking for max_fail_percentage 15621 1726882631.56711: done checking for max_fail_percentage 15621 1726882631.56712: checking to see if all hosts have failed and the running result is not ok 15621 1726882631.56713: done checking to see if all hosts have failed 15621 1726882631.56713: getting the remaining hosts for this loop 15621 1726882631.56714: done getting the remaining hosts for this loop 15621 1726882631.56717: getting the next task for host managed_node3 15621 1726882631.56721: done getting next task for host managed_node3 15621 1726882631.56724: ^ task is: TASK: meta (flush_handlers) 15621 1726882631.56725: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882631.56728: getting variables 15621 1726882631.56729: in VariableManager get_vars() 15621 1726882631.56747: Calling all_inventory to load vars for managed_node3 15621 1726882631.56748: Calling groups_inventory to load vars for managed_node3 15621 1726882631.56751: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882631.56759: Calling all_plugins_play to load vars for managed_node3 15621 1726882631.56761: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882631.56763: Calling groups_plugins_play to load vars for managed_node3 15621 1726882631.58170: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882631.59425: done with get_vars() 15621 1726882631.59442: done getting variables 15621 1726882631.59494: in VariableManager get_vars() 15621 1726882631.59502: Calling all_inventory to load vars for managed_node3 15621 1726882631.59503: Calling groups_inventory to load vars for managed_node3 15621 1726882631.59505: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882631.59509: Calling all_plugins_play to load vars for managed_node3 15621 1726882631.59511: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882631.59514: Calling groups_plugins_play to load vars for managed_node3 15621 1726882631.60320: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882631.61472: done with get_vars() 15621 1726882631.61492: done queuing things up, now waiting for results queue to drain 15621 1726882631.61494: results queue empty 15621 1726882631.61495: checking for any_errors_fatal 15621 1726882631.61497: done checking for any_errors_fatal 15621 1726882631.61498: checking for max_fail_percentage 15621 1726882631.61499: done checking for max_fail_percentage 15621 1726882631.61499: checking to see if all hosts have failed and the running result is not ok 15621 1726882631.61500: done checking to see if all hosts have failed 15621 1726882631.61503: getting the remaining hosts for this loop 15621 1726882631.61504: done getting the remaining hosts for this loop 15621 1726882631.61506: getting the next task for host managed_node3 15621 1726882631.61509: done getting next task for host managed_node3 15621 1726882631.61511: ^ task is: TASK: Verify network state restored to default 15621 1726882631.61512: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882631.61514: getting variables 15621 1726882631.61514: in VariableManager get_vars() 15621 1726882631.61521: Calling all_inventory to load vars for managed_node3 15621 1726882631.61524: Calling groups_inventory to load vars for managed_node3 15621 1726882631.61526: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882631.61530: Calling all_plugins_play to load vars for managed_node3 15621 1726882631.61531: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882631.61533: Calling groups_plugins_play to load vars for managed_node3 15621 1726882631.62367: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882631.63490: done with get_vars() 15621 1726882631.63505: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:80 Friday 20 September 2024 21:37:11 -0400 (0:00:02.515) 0:01:03.714 ****** 15621 1726882631.63562: entering _queue_task() for managed_node3/include_tasks 15621 1726882631.63830: worker is 1 (out of 1 available) 15621 1726882631.63845: exiting _queue_task() for managed_node3/include_tasks 15621 1726882631.63856: done queuing things up, now waiting for results queue to drain 15621 1726882631.63858: waiting for pending results... 15621 1726882631.64052: running TaskExecutor() for managed_node3/TASK: Verify network state restored to default 15621 1726882631.64137: in run() - task 0affc7ec-ae25-af1a-5b92-000000000078 15621 1726882631.64151: variable 'ansible_search_path' from source: unknown 15621 1726882631.64186: calling self._execute() 15621 1726882631.64264: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882631.64268: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882631.64280: variable 'omit' from source: magic vars 15621 1726882631.64580: variable 'ansible_distribution_major_version' from source: facts 15621 1726882631.64589: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882631.64595: _execute() done 15621 1726882631.64598: dumping result to json 15621 1726882631.64602: done dumping result, returning 15621 1726882631.64610: done running TaskExecutor() for managed_node3/TASK: Verify network state restored to default [0affc7ec-ae25-af1a-5b92-000000000078] 15621 1726882631.64615: sending task result for task 0affc7ec-ae25-af1a-5b92-000000000078 15621 1726882631.64718: done sending task result for task 0affc7ec-ae25-af1a-5b92-000000000078 15621 1726882631.64724: WORKER PROCESS EXITING 15621 1726882631.64761: no more pending results, returning what we have 15621 1726882631.64766: in VariableManager get_vars() 15621 1726882631.64801: Calling all_inventory to load vars for managed_node3 15621 1726882631.64804: Calling groups_inventory to load vars for managed_node3 15621 1726882631.64808: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882631.64820: Calling all_plugins_play to load vars for managed_node3 15621 1726882631.64824: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882631.64828: Calling groups_plugins_play to load vars for managed_node3 15621 1726882631.65762: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882631.67001: done with get_vars() 15621 1726882631.67015: 
variable 'ansible_search_path' from source: unknown 15621 1726882631.67027: we have included files to process 15621 1726882631.67027: generating all_blocks data 15621 1726882631.67028: done generating all_blocks data 15621 1726882631.67029: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 15621 1726882631.67030: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 15621 1726882631.67031: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 15621 1726882631.67313: done processing included file 15621 1726882631.67315: iterating over new_blocks loaded from include file 15621 1726882631.67316: in VariableManager get_vars() 15621 1726882631.67324: done with get_vars() 15621 1726882631.67326: filtering new block on tags 15621 1726882631.67337: done filtering new block on tags 15621 1726882631.67339: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node3 15621 1726882631.67342: extending task lists for all hosts with included blocks 15621 1726882631.67361: done extending task lists 15621 1726882631.67362: done processing included files 15621 1726882631.67363: results queue empty 15621 1726882631.67363: checking for any_errors_fatal 15621 1726882631.67364: done checking for any_errors_fatal 15621 1726882631.67364: checking for max_fail_percentage 15621 1726882631.67365: done checking for max_fail_percentage 15621 1726882631.67366: checking to see if all hosts have failed and the running result is not ok 15621 1726882631.67366: done checking to see if all hosts have failed 15621 1726882631.67367: getting the remaining hosts for this loop 15621 1726882631.67367: done getting the remaining hosts for this loop 15621 1726882631.67369: getting the next task for host managed_node3 15621 1726882631.67371: done getting next task for host managed_node3 15621 1726882631.67373: ^ task is: TASK: Check routes and DNS 15621 1726882631.67376: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15621 1726882631.67378: getting variables 15621 1726882631.67378: in VariableManager get_vars() 15621 1726882631.67383: Calling all_inventory to load vars for managed_node3 15621 1726882631.67385: Calling groups_inventory to load vars for managed_node3 15621 1726882631.67386: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882631.67390: Calling all_plugins_play to load vars for managed_node3 15621 1726882631.67392: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882631.67393: Calling groups_plugins_play to load vars for managed_node3 15621 1726882631.68197: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882631.69327: done with get_vars() 15621 1726882631.69342: done getting variables 15621 1726882631.69371: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Friday 20 September 2024 21:37:11 -0400 (0:00:00.058) 0:01:03.773 ****** 15621 1726882631.69395: entering _queue_task() for managed_node3/shell 15621 1726882631.69628: worker is 1 (out of 1 available) 15621 1726882631.69642: exiting _queue_task() for managed_node3/shell 15621 1726882631.69654: done queuing things up, now waiting for results queue to drain 15621 1726882631.69656: waiting for pending results... 15621 1726882631.69829: running TaskExecutor() for managed_node3/TASK: Check routes and DNS 15621 1726882631.69905: in run() - task 0affc7ec-ae25-af1a-5b92-00000000057e 15621 1726882631.69918: variable 'ansible_search_path' from source: unknown 15621 1726882631.69921: variable 'ansible_search_path' from source: unknown 15621 1726882631.69953: calling self._execute() 15621 1726882631.70026: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882631.70030: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882631.70040: variable 'omit' from source: magic vars 15621 1726882631.70326: variable 'ansible_distribution_major_version' from source: facts 15621 1726882631.70339: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882631.70345: variable 'omit' from source: magic vars 15621 1726882631.70380: variable 'omit' from source: magic vars 15621 1726882631.70404: variable 'omit' from source: magic vars 15621 1726882631.70442: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15621 1726882631.70471: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15621 1726882631.70486: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15621 1726882631.70501: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882631.70511: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15621 1726882631.70541: variable 'inventory_hostname' from source: host vars for 'managed_node3' 
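(Note on the records above: the include task "Verify network state restored to default" at tests_ethernet.yml:80 passes its distribution guard -- "Evaluated conditional (ansible_distribution_major_version != '6'): True" -- and pulls in tasks/check_network_dns.yml, after which the "Check routes and DNS" task resolves its ssh connection and sh shell plugins. A minimal YAML sketch of such a guarded include is shown below; this is a reconstruction, not the verbatim contents of tests_ethernet.yml, and the distribution guard may in fact be inherited from an enclosing block or play rather than set on the task itself.

    # Hypothetical reconstruction of the include task seen in the log above.
    - name: Verify network state restored to default
      include_tasks: tasks/check_network_dns.yml
      when: ansible_distribution_major_version != '6'
)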
15621 1726882631.70545: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882631.70547: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882631.70621: Set connection var ansible_connection to ssh 15621 1726882631.70630: Set connection var ansible_shell_executable to /bin/sh 15621 1726882631.70635: Set connection var ansible_timeout to 10 15621 1726882631.70639: Set connection var ansible_shell_type to sh 15621 1726882631.70648: Set connection var ansible_module_compression to ZIP_DEFLATED 15621 1726882631.70651: Set connection var ansible_pipelining to False 15621 1726882631.70671: variable 'ansible_shell_executable' from source: unknown 15621 1726882631.70678: variable 'ansible_connection' from source: unknown 15621 1726882631.70681: variable 'ansible_module_compression' from source: unknown 15621 1726882631.70684: variable 'ansible_shell_type' from source: unknown 15621 1726882631.70686: variable 'ansible_shell_executable' from source: unknown 15621 1726882631.70688: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882631.70691: variable 'ansible_pipelining' from source: unknown 15621 1726882631.70693: variable 'ansible_timeout' from source: unknown 15621 1726882631.70696: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882631.70806: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15621 1726882631.70814: variable 'omit' from source: magic vars 15621 1726882631.70820: starting attempt loop 15621 1726882631.70824: running the handler 15621 1726882631.70834: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15621 1726882631.70849: _low_level_execute_command(): starting 15621 1726882631.70856: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15621 1726882631.71410: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882631.71414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882631.71417: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882631.71419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found <<< 15621 1726882631.71425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882631.71477: stderr 
chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882631.71481: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882631.71483: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882631.71575: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882631.73276: stdout chunk (state=3): >>>/root <<< 15621 1726882631.73386: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882631.73441: stderr chunk (state=3): >>><<< 15621 1726882631.73444: stdout chunk (state=3): >>><<< 15621 1726882631.73467: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882631.73481: _low_level_execute_command(): starting 15621 1726882631.73486: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882631.734637-17913-244811436064115 `" && echo ansible-tmp-1726882631.734637-17913-244811436064115="` echo /root/.ansible/tmp/ansible-tmp-1726882631.734637-17913-244811436064115 `" ) && sleep 0' 15621 1726882631.73952: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882631.73963: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found <<< 15621 1726882631.73965: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882631.73968: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882631.73970: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
<<< 15621 1726882631.74029: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882631.74032: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882631.74035: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882631.74108: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882631.76060: stdout chunk (state=3): >>>ansible-tmp-1726882631.734637-17913-244811436064115=/root/.ansible/tmp/ansible-tmp-1726882631.734637-17913-244811436064115 <<< 15621 1726882631.76172: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882631.76217: stderr chunk (state=3): >>><<< 15621 1726882631.76222: stdout chunk (state=3): >>><<< 15621 1726882631.76239: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882631.734637-17913-244811436064115=/root/.ansible/tmp/ansible-tmp-1726882631.734637-17913-244811436064115 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882631.76266: variable 'ansible_module_compression' from source: unknown 15621 1726882631.76312: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15621x2kdpbzd/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15621 1726882631.76351: variable 'ansible_facts' from source: unknown 15621 1726882631.76405: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882631.734637-17913-244811436064115/AnsiballZ_command.py 15621 1726882631.76510: Sending initial data 15621 1726882631.76514: Sent initial data (155 bytes) 15621 1726882631.76980: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882631.76983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882631.76987: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882631.76990: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found <<< 15621 1726882631.76992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882631.77040: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882631.77044: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882631.77133: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882631.78758: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 15621 1726882631.78761: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15621 1726882631.78842: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15621 1726882631.78936: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmp8idn532y /root/.ansible/tmp/ansible-tmp-1726882631.734637-17913-244811436064115/AnsiballZ_command.py <<< 15621 1726882631.78945: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882631.734637-17913-244811436064115/AnsiballZ_command.py" <<< 15621 1726882631.79017: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15621x2kdpbzd/tmp8idn532y" to remote "/root/.ansible/tmp/ansible-tmp-1726882631.734637-17913-244811436064115/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882631.734637-17913-244811436064115/AnsiballZ_command.py" <<< 15621 1726882631.79768: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882631.79831: stderr chunk (state=3): >>><<< 15621 1726882631.79835: stdout chunk (state=3): >>><<< 15621 1726882631.79851: done transferring module to remote 15621 1726882631.79861: _low_level_execute_command(): starting 15621 1726882631.79865: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882631.734637-17913-244811436064115/ /root/.ansible/tmp/ansible-tmp-1726882631.734637-17913-244811436064115/AnsiballZ_command.py && sleep 0' 15621 1726882631.80313: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882631.80317: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882631.80319: stderr chunk (state=3): >>>debug2: checking match for 
'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882631.80327: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882631.80330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882631.80332: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882631.80380: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882631.80383: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882631.80472: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882631.82654: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882631.82702: stderr chunk (state=3): >>><<< 15621 1726882631.82705: stdout chunk (state=3): >>><<< 15621 1726882631.82719: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882631.82725: _low_level_execute_command(): starting 15621 1726882631.82728: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882631.734637-17913-244811436064115/AnsiballZ_command.py && sleep 0' 15621 1726882631.83187: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882631.83192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found <<< 15621 1726882631.83196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration <<< 15621 1726882631.83198: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882631.83248: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882631.83258: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882631.83347: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882632.01033: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 02:19:da:ea:a3:f3 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.45.226/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0\n valid_lft 2872sec preferred_lft 2872sec\n inet6 fe80::19:daff:feea:a3f3/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.44.1 dev eth0 proto dhcp src 10.31.45.226 metric 100 \n10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.45.226 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8).\n# Do not edit.\n#\n# This file might be symlinked as /etc/resolv.conf. If you're looking at\n# /etc/resolv.conf and seeing this text, you have followed the symlink.\n#\n# This is a dynamic resolv.conf file for connecting local clients to the\n# internal DNS stub resolver of systemd-resolved. This file lists all\n# configured search domains.\n#\n# Run \"resolvectl status\" to see details about the uplink DNS servers\n# currently in use.\n#\n# Third party programs should typically not access this file directly, but only\n# through the symlink at /etc/resolv.conf. 
To manage man:resolv.conf(5) in a\n# different way, replace this symlink by a static file or a different symlink.\n#\n# See man:systemd-resolved.service(8) for details about the supported modes of\n# operation for /etc/resolv.conf.\n\nnameserver 127.0.0.53\noptions edns0 trust-ad\nsearch us-east-1.aws.redhat.com", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:37:11.998982", "end": "2024-09-20 21:37:12.008081", "delta": "0:00:00.009099", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15621 1726882632.02658: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. <<< 15621 1726882632.02829: stdout chunk (state=3): >>><<< 15621 1726882632.02833: stderr chunk (state=3): >>><<< 15621 1726882632.02837: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 02:19:da:ea:a3:f3 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.45.226/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0\n valid_lft 2872sec preferred_lft 2872sec\n inet6 fe80::19:daff:feea:a3f3/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.44.1 dev eth0 proto dhcp src 10.31.45.226 metric 100 \n10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.45.226 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8).\n# Do not edit.\n#\n# This file might be symlinked as /etc/resolv.conf. If you're looking at\n# /etc/resolv.conf and seeing this text, you have followed the symlink.\n#\n# This is a dynamic resolv.conf file for connecting local clients to the\n# internal DNS stub resolver of systemd-resolved. This file lists all\n# configured search domains.\n#\n# Run \"resolvectl status\" to see details about the uplink DNS servers\n# currently in use.\n#\n# Third party programs should typically not access this file directly, but only\n# through the symlink at /etc/resolv.conf. 
To manage man:resolv.conf(5) in a\n# different way, replace this symlink by a static file or a different symlink.\n#\n# See man:systemd-resolved.service(8) for details about the supported modes of\n# operation for /etc/resolv.conf.\n\nnameserver 127.0.0.53\noptions edns0 trust-ad\nsearch us-east-1.aws.redhat.com", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:37:11.998982", "end": "2024-09-20 21:37:12.008081", "delta": "0:00:00.009099", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.226 closed. 
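(Note on the module invocation dumped above: its _raw_params field carries the full shell payload of the "Check routes and DNS" task -- under set -euo pipefail it prints interface state with ip a, the IPv4 and IPv6 routing tables, and /etc/resolv.conf. The sketch below reassembles those module arguments into an equivalent shell task; it is a reconstruction from the log, and the real task at check_network_dns.yml:6 may register the result or differ in layout.

    # Sketch reconstructed from the _raw_params shown in the log above.
    - name: Check routes and DNS
      ansible.builtin.shell: |
        set -euo pipefail
        echo IP
        ip a
        echo IP ROUTE
        ip route
        echo IP -6 ROUTE
        ip -6 route
        echo RESOLV
        if [ -f /etc/resolv.conf ]; then
          cat /etc/resolv.conf
        else
          echo NO /etc/resolv.conf
          ls -alrtF /etc/resolv.* || :
        fi
)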
15621 1726882632.02846: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882631.734637-17913-244811436064115/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15621 1726882632.02848: _low_level_execute_command(): starting 15621 1726882632.02851: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882631.734637-17913-244811436064115/ > /dev/null 2>&1 && sleep 0' 15621 1726882632.03507: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15621 1726882632.03516: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15621 1726882632.03530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15621 1726882632.03545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15621 1726882632.03615: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15621 1726882632.03659: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' <<< 15621 1726882632.03674: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15621 1726882632.03695: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15621 1726882632.03808: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15621 1726882632.05727: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15621 1726882632.05828: stderr chunk (state=3): >>><<< 15621 1726882632.05832: stdout chunk (state=3): >>><<< 15621 1726882632.06028: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.226 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.226 originally 10.31.45.226 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15621 1726882632.06032: handler run complete 15621 1726882632.06035: Evaluated conditional (False): False 15621 1726882632.06037: attempt loop complete, returning result 15621 1726882632.06040: _execute() done 15621 1726882632.06042: dumping result to json 15621 1726882632.06044: done dumping result, returning 15621 1726882632.06046: done running TaskExecutor() for managed_node3/TASK: Check routes and DNS [0affc7ec-ae25-af1a-5b92-00000000057e] 15621 1726882632.06048: sending task result for task 0affc7ec-ae25-af1a-5b92-00000000057e 15621 1726882632.06130: done sending task result for task 0affc7ec-ae25-af1a-5b92-00000000057e 15621 1726882632.06134: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.009099", "end": "2024-09-20 21:37:12.008081", "rc": 0, "start": "2024-09-20 21:37:11.998982" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 02:19:da:ea:a3:f3 brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.45.226/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0 valid_lft 2872sec preferred_lft 2872sec inet6 fe80::19:daff:feea:a3f3/64 scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.44.1 dev eth0 proto dhcp src 10.31.45.226 metric 100 10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.45.226 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8). # Do not edit. # # This file might be symlinked as /etc/resolv.conf. If you're looking at # /etc/resolv.conf and seeing this text, you have followed the symlink. # # This is a dynamic resolv.conf file for connecting local clients to the # internal DNS stub resolver of systemd-resolved. This file lists all # configured search domains. # # Run "resolvectl status" to see details about the uplink DNS servers # currently in use. # # Third party programs should typically not access this file directly, but only # through the symlink at /etc/resolv.conf. To manage man:resolv.conf(5) in a # different way, replace this symlink by a static file or a different symlink. 
# # See man:systemd-resolved.service(8) for details about the supported modes of # operation for /etc/resolv.conf. nameserver 127.0.0.53 options edns0 trust-ad search us-east-1.aws.redhat.com 15621 1726882632.06224: no more pending results, returning what we have 15621 1726882632.06229: results queue empty 15621 1726882632.06230: checking for any_errors_fatal 15621 1726882632.06232: done checking for any_errors_fatal 15621 1726882632.06233: checking for max_fail_percentage 15621 1726882632.06235: done checking for max_fail_percentage 15621 1726882632.06236: checking to see if all hosts have failed and the running result is not ok 15621 1726882632.06238: done checking to see if all hosts have failed 15621 1726882632.06239: getting the remaining hosts for this loop 15621 1726882632.06240: done getting the remaining hosts for this loop 15621 1726882632.06250: getting the next task for host managed_node3 15621 1726882632.06256: done getting next task for host managed_node3 15621 1726882632.06264: ^ task is: TASK: Verify DNS and network connectivity 15621 1726882632.06267: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15621 1726882632.06271: getting variables 15621 1726882632.06273: in VariableManager get_vars() 15621 1726882632.06306: Calling all_inventory to load vars for managed_node3 15621 1726882632.06309: Calling groups_inventory to load vars for managed_node3 15621 1726882632.06313: Calling all_plugins_inventory to load vars for managed_node3 15621 1726882632.06472: Calling all_plugins_play to load vars for managed_node3 15621 1726882632.06477: Calling groups_plugins_inventory to load vars for managed_node3 15621 1726882632.06481: Calling groups_plugins_play to load vars for managed_node3 15621 1726882632.08450: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15621 1726882632.10607: done with get_vars() 15621 1726882632.10642: done getting variables 15621 1726882632.10717: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Friday 20 September 2024 21:37:12 -0400 (0:00:00.413) 0:01:04.186 ****** 15621 1726882632.10754: entering _queue_task() for managed_node3/shell 15621 1726882632.11177: worker is 1 (out of 1 available) 15621 1726882632.11191: exiting _queue_task() for managed_node3/shell 15621 1726882632.11203: done queuing things up, now waiting for results queue to drain 15621 1726882632.11430: waiting for pending results... 
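(Note on the repeated OpenSSH stderr blocks in this run -- "auto-mux: Trying existing master at '/root/.ansible/cp/9aa64530f0'", "mux_client_request_session: master session id: 2": Ansible's ssh connection plugin enables ControlMaster/ControlPersist by default and keeps its control sockets under ~/.ansible/cp, so every low-level command here opens a new mux session on the existing master instead of a fresh SSH handshake. This run relies on those built-in defaults; if the options ever had to be pinned explicitly per host or group, a purely illustrative group_vars sketch (values mirroring common defaults, not taken from this playbook) might look like:

    # Illustrative only -- this test run uses the ssh plugin's defaults,
    # not an explicit override like the one below.
    ansible_ssh_common_args: >-
      -o ControlMaster=auto
      -o ControlPersist=60s
      -o ControlPath=~/.ansible/cp/%C
)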
15621 1726882632.11636: running TaskExecutor() for managed_node3/TASK: Verify DNS and network connectivity 15621 1726882632.11656: in run() - task 0affc7ec-ae25-af1a-5b92-00000000057f 15621 1726882632.11679: variable 'ansible_search_path' from source: unknown 15621 1726882632.11687: variable 'ansible_search_path' from source: unknown 15621 1726882632.11734: calling self._execute() 15621 1726882632.11837: variable 'ansible_host' from source: host vars for 'managed_node3' 15621 1726882632.11849: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15621 1726882632.11872: variable 'omit' from source: magic vars 15621 1726882632.12289: variable 'ansible_distribution_major_version' from source: facts 15621 1726882632.12315: Evaluated conditional (ansible_distribution_major_version != '6'): True 15621 1726882632.12472: variable 'ansible_facts' from source: unknown 15621 1726882632.13453: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): False 15621 1726882632.13502: when evaluation is False, skipping this task 15621 1726882632.13505: _execute() done 15621 1726882632.13508: dumping result to json 15621 1726882632.13510: done dumping result, returning 15621 1726882632.13512: done running TaskExecutor() for managed_node3/TASK: Verify DNS and network connectivity [0affc7ec-ae25-af1a-5b92-00000000057f] 15621 1726882632.13515: sending task result for task 0affc7ec-ae25-af1a-5b92-00000000057f skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_facts[\"distribution\"] == \"CentOS\"", "skip_reason": "Conditional result was False" } 15621 1726882632.13783: no more pending results, returning what we have 15621 1726882632.13787: results queue empty 15621 1726882632.13789: checking for any_errors_fatal 15621 1726882632.13801: done checking for any_errors_fatal 15621 1726882632.13802: checking for max_fail_percentage 15621 1726882632.13804: done checking for max_fail_percentage 15621 1726882632.13805: checking to see if all hosts have failed and the running result is not ok 15621 1726882632.13806: done checking to see if all hosts have failed 15621 1726882632.13807: getting the remaining hosts for this loop 15621 1726882632.13808: done getting the remaining hosts for this loop 15621 1726882632.13814: getting the next task for host managed_node3 15621 1726882632.13832: done getting next task for host managed_node3 15621 1726882632.13836: ^ task is: TASK: meta (flush_handlers) 15621 1726882632.13838: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
15621 1726882632.13841: getting variables
15621 1726882632.13843: in VariableManager get_vars()
15621 1726882632.13875: Calling all_inventory to load vars for managed_node3
15621 1726882632.13878: Calling groups_inventory to load vars for managed_node3
15621 1726882632.13882: Calling all_plugins_inventory to load vars for managed_node3
15621 1726882632.13940: done sending task result for task 0affc7ec-ae25-af1a-5b92-00000000057f
15621 1726882632.13944: WORKER PROCESS EXITING
15621 1726882632.13959: Calling all_plugins_play to load vars for managed_node3
15621 1726882632.13962: Calling groups_plugins_inventory to load vars for managed_node3
15621 1726882632.13965: Calling groups_plugins_play to load vars for managed_node3
15621 1726882632.15846: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15621 1726882632.18166: done with get_vars()
15621 1726882632.18195: done getting variables
15621 1726882632.18269: in VariableManager get_vars()
15621 1726882632.18279: Calling all_inventory to load vars for managed_node3
15621 1726882632.18282: Calling groups_inventory to load vars for managed_node3
15621 1726882632.18284: Calling all_plugins_inventory to load vars for managed_node3
15621 1726882632.18289: Calling all_plugins_play to load vars for managed_node3
15621 1726882632.18292: Calling groups_plugins_inventory to load vars for managed_node3
15621 1726882632.18295: Calling groups_plugins_play to load vars for managed_node3
15621 1726882632.19736: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15621 1726882632.21833: done with get_vars()
15621 1726882632.21860: done queuing things up, now waiting for results queue to drain
15621 1726882632.21862: results queue empty
15621 1726882632.21863: checking for any_errors_fatal
15621 1726882632.21866: done checking for any_errors_fatal
15621 1726882632.21866: checking for max_fail_percentage
15621 1726882632.21867: done checking for max_fail_percentage
15621 1726882632.21868: checking to see if all hosts have failed and the running result is not ok
15621 1726882632.21869: done checking to see if all hosts have failed
15621 1726882632.21869: getting the remaining hosts for this loop
15621 1726882632.21870: done getting the remaining hosts for this loop
15621 1726882632.21873: getting the next task for host managed_node3
15621 1726882632.21877: done getting next task for host managed_node3
15621 1726882632.21878: ^ task is: TASK: meta (flush_handlers)
15621 1726882632.21880: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
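The "meta (flush_handlers)" entries above are not tasks written in the test playbook; they are the implicit handler-flush steps that Ansible appends around the end of each play so that any notified handlers run before the play is considered finished. The same flush can be requested explicitly at any point in a play, for example:

    # Explicit equivalent of the implicit flush seen in the log: run any
    # currently notified handlers now instead of waiting for the end of the play.
    - name: Flush handlers before continuing
      ansible.builtin.meta: flush_handlers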
15621 1726882632.21883: getting variables
15621 1726882632.21888: in VariableManager get_vars()
15621 1726882632.21897: Calling all_inventory to load vars for managed_node3
15621 1726882632.21899: Calling groups_inventory to load vars for managed_node3
15621 1726882632.21902: Calling all_plugins_inventory to load vars for managed_node3
15621 1726882632.21907: Calling all_plugins_play to load vars for managed_node3
15621 1726882632.21909: Calling groups_plugins_inventory to load vars for managed_node3
15621 1726882632.21912: Calling groups_plugins_play to load vars for managed_node3
15621 1726882632.23465: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15621 1726882632.25595: done with get_vars()
15621 1726882632.25618: done getting variables
15621 1726882632.25682: in VariableManager get_vars()
15621 1726882632.25693: Calling all_inventory to load vars for managed_node3
15621 1726882632.25696: Calling groups_inventory to load vars for managed_node3
15621 1726882632.25698: Calling all_plugins_inventory to load vars for managed_node3
15621 1726882632.25704: Calling all_plugins_play to load vars for managed_node3
15621 1726882632.25707: Calling groups_plugins_inventory to load vars for managed_node3
15621 1726882632.25710: Calling groups_plugins_play to load vars for managed_node3
15621 1726882632.27202: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15621 1726882632.30382: done with get_vars()
15621 1726882632.30412: done queuing things up, now waiting for results queue to drain
15621 1726882632.30415: results queue empty
15621 1726882632.30415: checking for any_errors_fatal
15621 1726882632.30417: done checking for any_errors_fatal
15621 1726882632.30418: checking for max_fail_percentage
15621 1726882632.30419: done checking for max_fail_percentage
15621 1726882632.30420: checking to see if all hosts have failed and the running result is not ok
15621 1726882632.30421: done checking to see if all hosts have failed
15621 1726882632.30424: getting the remaining hosts for this loop
15621 1726882632.30425: done getting the remaining hosts for this loop
15621 1726882632.30435: getting the next task for host managed_node3
15621 1726882632.30439: done getting next task for host managed_node3
15621 1726882632.30440: ^ task is: None
15621 1726882632.30446: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15621 1726882632.30447: done queuing things up, now waiting for results queue to drain
15621 1726882632.30449: results queue empty
15621 1726882632.30449: checking for any_errors_fatal
15621 1726882632.30450: done checking for any_errors_fatal
15621 1726882632.30451: checking for max_fail_percentage
15621 1726882632.30452: done checking for max_fail_percentage
15621 1726882632.30453: checking to see if all hosts have failed and the running result is not ok
15621 1726882632.30454: done checking to see if all hosts have failed
15621 1726882632.30455: getting the next task for host managed_node3
15621 1726882632.30458: done getting next task for host managed_node3
15621 1726882632.30459: ^ task is: None
15621 1726882632.30460: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed_node3 : ok=82 changed=3 unreachable=0 failed=0 skipped=74 rescued=0 ignored=1

Friday 20 September 2024 21:37:12 -0400 (0:00:00.197) 0:01:04.384 ******
===============================================================================
Gathering Facts --------------------------------------------------------- 3.53s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml:6
fedora.linux_system_roles.network : Check which services are running ---- 2.57s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Gathering Facts --------------------------------------------------------- 2.53s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5
Gathering Facts --------------------------------------------------------- 2.52s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:77
Gathering Facts --------------------------------------------------------- 2.46s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3
Gathering Facts --------------------------------------------------------- 2.46s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3
fedora.linux_system_roles.network : Check which services are running ---- 2.44s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Gathering Facts --------------------------------------------------------- 2.43s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:50
fedora.linux_system_roles.network : Check which services are running ---- 2.41s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Gathering Facts --------------------------------------------------------- 2.36s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:3
Gathering Facts --------------------------------------------------------- 2.36s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:68
Gather the minimum subset of ansible_facts required by the network role test --- 2.21s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
Gathering Facts --------------------------------------------------------- 2.19s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:13
Gathering Facts --------------------------------------------------------- 2.13s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:33
fedora.linux_system_roles.network : Check which packages are installed --- 1.84s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 1.77s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Install iproute --------------------------------------------------------- 1.58s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
Create veth interface lsr27 --------------------------------------------- 1.47s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.36s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Check which packages are installed --- 1.09s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
15621 1726882632.30582: RUNNING CLEANUP